Programmatic AirPrint in the Enterprise @ University of Utah MacAdmins

On Feb 16, 2022, I gave a short presentation at the University of Utah MacAdmins meeting to share the solution I’d come up with for installing printers on the Mac programmatically, similar to what AirPrint does, albeit without needing to enable AirPrint on the printers themselves.

While this more or less covers what I’d written on my blog years ago on the topic (Configuring Printers Programmatically for AirPrint & Configuring Printers Programmatically for AirPrint Part 2: Now with Icons!), the presentation goes further, discussing the upcoming deprecation of .ppd files as well as some of the underlying standards and protocols that make this work.

Here is a link to the presentation recording: Recording

Here is a link to the presentation slides: Slides

I always appreciate feedback (I would not call myself any sort of printing expert), so if anything is unclear or you have questions, please feel free to let me know in the comments.
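For context, the heart of this approach is the IPP Everywhere support built into CUPS, which lets you create a print queue without a .ppd file. Here is a minimal sketch (not necessarily the exact commands from the presentation; the queue name and printer URI are hypothetical placeholders you would replace with your own):

```shell
# Hypothetical printer URI; substitute your printer's DNS name or IP.
PRINTER_URI="ipp://printer.example.com/ipp/print"

# -m everywhere tells CUPS to query the printer over IPP and build the
# queue from the attributes the printer reports -- no .ppd file needed.
# -E enables the queue and accepts jobs; -p sets the queue name.
lpadmin -p Example_Printer -E -v "$PRINTER_URI" -m everywhere
```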

Gathering GitHub Release Versions without Assets with autopkg

A while ago, I ran into an uncommon scenario: a piece of software I was attempting to incorporate into an autopkg workflow (https://github.com/TigerVNC/tigervnc) hosted its project on GitHub but did not include downloadable assets with its releases. Instead, the downloads were hosted on an entirely different website (Sourceforge). Normally, you would just provide the org and project information to the GitHubReleasesInfoProvider processor, but that processor expects at least one published asset and produces an error without one.

Initially, I filed an autopkg issue requesting an additional variable that doesn’t require an asset, but it proved easier to simply use the URLTextSearcher processor to acquire the necessary version information.

While versioning schemes vary, the example below and its regex (re_pattern) will work for the majority of software on GitHub, including projects that use semantic-style version tags (e.g., a release tagged v10.2.15.0.30):

<dict>
    <key>Comment</key>
    <string>Get latest release version number from GitHub</string>
    <key>Arguments</key>
    <dict>
        <key>re_pattern</key>
        <string>/releases/tag/v?([\d.]+)</string>
        <key>result_output_var_name</key>
        <string>version</string>
        <key>url</key>
        <string>https://github.com/<org>/<project>/releases/latest</string>
    </dict>
    <key>Processor</key>
    <string>URLTextSearcher</string>
</dict>
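To sanity-check the pattern outside of autopkg, you can run it against a line shaped like what the releases page serves. A minimal sketch (the project path and v1.13.1 tag are just illustrative values, and [0-9] stands in for \d since grep -E has no \d shorthand):

```shell
# A line resembling what the /releases/latest page returns; the project
# and version here are made-up example values.
html='<a href="/TigerVNC/tigervnc/releases/tag/v1.13.1">'

# The first grep isolates the tag path (same shape as re_pattern above);
# the second strips it down to the bare version number.
version=$(printf '%s' "$html" | grep -oE '/releases/tag/v?[0-9.]+' | grep -oE '[0-9.]+')

echo "$version"    # 1.13.1
```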

If you find yourself in a similar situation, you can find more information on the autopkg wiki page I created: https://github.com/autopkg/autopkg/wiki/GitHub-Releases-with-No-Assets

Triggering Automated Asset Check-In Alerts for Snipe-IT

While I had written a post some time ago about setting up multiple Snipe-IT instances, I hadn’t actually begun the process of importing our various disparate spreadsheet data and putting the final touches on our inventory configuration until recently.

After importing our regularly loaned assets, I found that once assets are checked out (so long as you have Email Alerts enabled), overdue assets generate a compiled email every day at midnight.

While the alert is immensely helpful for quickly determining who we need to follow up with, I had configured a distribution group to receive email alerts, which meant my colleagues and I were all receiving this same late-night email, including on weekends.

Thankfully, the Snipe-IT docs had some helpful insight on manually triggering these alerts: https://snipe-it.readme.io/docs/email-alerts

Interestingly, as of this writing the docs state that self-hosted instances of Snipe-IT need to set up a cron job to trigger these alerts. While I had written another post on configuring automated backups, those backups were staggered for each instance, so the consistent, unprompted midnight alert appears to be at odds with the docs.

That being said, the docs also refer to manually triggering various alerts (https://snipe-it.readme.io/docs/configuring-alerts-backups) as well as an overview of all available alerts (https://snipe-it.readme.io/docs/notifications-overview).

As it turns out, you can opt to disable email alerts entirely (Settings > Notifications > Alerts) to avoid the default scheduled email alerts, while still generating the alerts you do want by triggering them manually via cron job.

First, disable email alerts by unchecking the Email Alerts Enabled checkbox. This will prevent the daily midnight alerts.

With that done, remotely connect to your Snipe-IT host and add something like the following to your crontab file:

# Snipe-IT expected check-in alert
30 7 * * 1-5    /usr/bin/docker exec <container_name> /usr/bin/php artisan snipeit:expected-checkin
  • 30 7 = run at 7:30 am (minute 30, hour 7)
  • * * = every day of the month, every month
  • 1-5 = only on weekdays (Monday–Friday)
  • /usr/bin/docker exec <container_name> /usr/bin/php artisan snipeit:expected-checkin = runs the artisan command inside the specified Snipe-IT container, triggering the expected check-in alert email

Repeat as necessary for any additional instances.

Now, we’re not being bothered late at night (or on weekends) and have a full list of outstanding assets sent just as our first Helpdesk tech sits down in the morning.

You could also expand this to the other available alerts, triggering them at different times and intervals. If you instead wanted to run everything at once, you could simply replace snipeit:expected-checkin with schedule:run, letting the built-in scheduler determine automatically which (if any) of the backup and alert tasks need to be run.
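As a sketch, the schedule:run variant would look like this in the crontab (Laravel, which Snipe-IT is built on, expects its scheduler to be invoked every minute so it can decide internally which due tasks to execute; the container name placeholder is the same as above):

```shell
# Invoke Laravel's scheduler every minute; it determines internally
# which (if any) Snipe-IT alert & backup tasks are due to run.
* * * * *    /usr/bin/docker exec <container_name> /usr/bin/php artisan schedule:run
```

Note that this would reinstate any tasks on the default schedule, so the granular per-alert approach above is what keeps the timing fully under your control.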