It’s becoming increasingly common for partners to transfer large files, whether to deploy software updates, perform OS upgrades, or help a customer with a project. This need is more pressing than ever and will continue to grow as applications and updates get larger.
Amazon Web Services (AWS) has a fairly inexpensive offering called Simple Storage Service (S3), which lets you upload files and then download them via a direct URL.
Imagine a managed services provider (MSP) wants to migrate one of their customers to Microsoft Office 365. This means downloading a file that is 1.9GB for the 64-bit version, or around 1.7GB for the 32-bit version. Since some computers likely still require the 32-bit version, both installation files have to be available to the MSP.
The MSP could install them manually; however, as we know, deploying Office 365 to 100 computers by hand can take days. Alternatively, they could do it automatically in a matter of minutes using an automation engine like SolarWinds® Automation Manager.
In this article, I’ll focus on the AWS side. In the next article, I’ll dig deeper into the Office 365 deployment process, as it’s frequently requested.
If you don’t have an AWS account, it’s pretty simple and inexpensive to get one. You’ll need to register and provide a credit card for billing. AWS S3 charges a storage fee per GB per month, plus a minimal per-request fee for each download or upload.
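To get a feel for the pricing model (storage per GB-month plus a per-request fee), here’s a minimal cost estimator. The two rates below are hypothetical placeholders, not quoted AWS prices; check the current S3 pricing page for your region before budgeting.

```python
# Rough S3 cost estimate for hosting the two Office installers.
# Both rates are hypothetical placeholders, NOT current AWS pricing:
STORAGE_RATE_PER_GB_MONTH = 0.023    # assumed standard-tier rate, USD
REQUEST_RATE_PER_1000_GETS = 0.0004  # assumed GET-request rate, USD

def monthly_cost(total_gb: float, downloads: int) -> float:
    """One month of storage plus the per-request download fees."""
    storage = total_gb * STORAGE_RATE_PER_GB_MONTH
    requests = downloads / 1000 * REQUEST_RATE_PER_1000_GETS
    return round(storage + requests, 4)

# Two installers (~1.9 GB + ~1.7 GB) downloaded by 100 machines:
print(monthly_cost(3.6, 100))
```

Even at rates in this ballpark, hosting a few gigabytes for a month costs well under a dollar, which is why S3 is attractive for this use case.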
Using the AWS web console, you upload the files, and a unique link is generated for each one.
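For publicly readable objects, that link follows S3’s predictable virtual-hosted-style URL format, so you can also construct it yourself. A small sketch (the bucket, region, and key names are placeholders; a private object would need a pre-signed URL instead):

```python
from urllib.parse import quote

def s3_object_url(bucket: str, region: str, key: str) -> str:
    """Build the virtual-hosted-style URL for a public S3 object.
    Private objects require a pre-signed URL instead."""
    return f"https://{bucket}.s3.{region}.amazonaws.com/{quote(key)}"

# Example names are placeholders, not real resources:
print(s3_object_url("msp-installers", "us-east-1",
                    "office365/setup-x64.exe"))
# https://msp-installers.s3.us-east-1.amazonaws.com/office365/setup-x64.exe
```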
Once that’s done, you’ll have access to the files anywhere in the world without consuming bandwidth on your RMM, your internal FTP/SFTP server, or any other file exchange mechanism.
In Automation Manager, you can do one of two things based on your needs:
- You can use the “Download file from URL” object to download it locally to a temporary location, use the “Extract Compressed File” object to unzip it, and run the program.
- Alternatively, you can use the “Download file from Amazon” object, which will require the bucket name, file key, path to download to, access key, and secret key. This is useful for licensed software, license keys, or any other confidential information that you have to pass to a client’s computer.
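The first option’s download-and-unzip steps can be approximated in plain Python to show what the Automation Manager objects are doing under the hood. This is a minimal stdlib-only sketch; the URL in the usage comment is a placeholder for your own S3 object link, and the actual “Download file from URL” and “Extract Compressed File” objects handle this for you inside a policy.

```python
import urllib.request
import zipfile
from pathlib import Path

def download_file(url: str, dest: Path) -> Path:
    """Mirror the 'Download file from URL' step: save url to dest."""
    with urllib.request.urlopen(url) as resp, open(dest, "wb") as out:
        out.write(resp.read())
    return dest

def extract_zip(archive: Path, target_dir: Path) -> list[str]:
    """Mirror the 'Extract Compressed File' step: unzip and list contents."""
    with zipfile.ZipFile(archive) as zf:
        zf.extractall(target_dir)
        return zf.namelist()

# Usage sketch (URL and paths are placeholders):
# archive = download_file("https://<bucket>.s3.amazonaws.com/installer.zip",
#                         Path("C:/Temp/installer.zip"))
# print(extract_zip(archive, Path("C:/Temp/installer")))
```

After extraction, the policy would launch the installer from the temporary location, then clean up.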
In today’s article, I covered using AWS, but keep in mind that you can also use Azure or Google Cloud to share files with your customers for your project work or software deployments.
Automation of the Week
This week’s automation policy was created by Johnny Walker from Spinen. He submitted this winning policy to the 2019 North American Automation Hackathon.
The policy is quite simple. It searches Active Directory and disables users who haven’t logged in for a certain number of days, which can help bolster security in your customers’ networks. Customers will often manage their own users; however, accounts belonging to employees who leave are sometimes never disabled or deleted, which creates both costs and security risks.
This policy scans the directory, finds users who haven’t logged into Active Directory in the last ‘x’ days (this is configurable), and disables them. Of course, since system and service accounts don’t usually log in, you can protect them via an exclude list.
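The selection logic is straightforward: compute a cutoff date, keep every account whose last logon predates it, and skip anything on the exclusion list. Here’s a minimal sketch of that logic; the real policy runs against Active Directory inside Automation Manager, and the account names below are hypothetical sample data.

```python
from datetime import datetime, timedelta

def users_to_disable(users, max_inactive_days, exclude=()):
    """Return account names whose last logon is older than the cutoff,
    skipping anything on the exclusion list (e.g. service accounts)."""
    cutoff = datetime.now() - timedelta(days=max_inactive_days)
    return [name for name, last_logon in users
            if name not in exclude and last_logon < cutoff]

# Hypothetical sample data: (account name, last AD logon)
accounts = [
    ("jdoe",    datetime.now() - timedelta(days=120)),
    ("asmith",  datetime.now() - timedelta(days=3)),
    ("svc-sql", datetime.now() - timedelta(days=400)),  # service account
]
print(users_to_disable(accounts, 90, exclude={"svc-sql"}))
# ['jdoe']
```

Note how the exclusion check runs before the date check, so a long-dormant service account is never touched regardless of its last logon.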
See the article on the policy and the link to the policy here: https://success.solarwindsmsp.com/kb/solarwinds_n-central/Disable-Inactive-Users
If you’ve created an automation policy and would like to share it with the community, please send it my way by emailing me at [email protected].
As always, don’t forget to check out the Automation Cookbook at www.solarwindsmsp.com/cookbook if you’re interested in other automation policies, script checks, and custom services.