Introduction
My solution to this problem was to use an Azure Function App running PowerShell for the file copy, an Azure Blob container providing a managed SFTP service with minimal configuration overhead, and an Azure File Share configured so that only our internal systems can access it.
Subscription
Our current policy is to set up a new Azure subscription for each solution, to keep billing and access easy to manage, so this was created first.
Resource Group
After the subscription was created, the next task was to create an Azure Resource Group to hold all artifacts related to this process. I also created three storage accounts: one for the Azure Blob container running the SFTP service, one for the Azure File Share, and one for the Azure Function App.
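A minimal sketch of this step with Az PowerShell is shown below. The resource group and the first two account names match the variable list later in the post; "safuncapp", the location, SKU and kind are assumptions, and the SFTP-specific settings for the blob account are covered in a later section.

New-AzResourceGroup -Name "rg-myapp-ukwest-001" -Location "ukwest"

# File Share and Function App storage accounts (general-purpose v2 assumed).
New-AzStorageAccount -ResourceGroupName "rg-myapp-ukwest-001" -Name "safileaccount" `
    -Location "ukwest" -SkuName "Standard_LRS" -Kind "StorageV2"
New-AzStorageAccount -ResourceGroupName "rg-myapp-ukwest-001" -Name "safuncapp" `
    -Location "ukwest" -SkuName "Standard_LRS" -Kind "StorageV2"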
Virtual Networks
To allow the Azure Function App to access the two storage accounts, a virtual network was created that would be used in the firewall configuration on each storage account. As we wanted the Azure File Share to be accessible only from our internal network, a second virtual network was created so that we could use a Private Endpoint reached via our ExpressRoute connection to Azure.
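A sketch of the two virtual networks in Az PowerShell, with assumed names, address spaces and subnet layout. The Microsoft.Storage service endpoint on the Function App subnet is what lets the storage account firewalls accept traffic from that subnet later on.

# Subnet used for the Function App's VNet integration.
$funcSubnet = New-AzVirtualNetworkSubnetConfig -Name "snet-functionapp" `
    -AddressPrefix "10.10.1.0/24" -ServiceEndpoint "Microsoft.Storage"
New-AzVirtualNetwork -Name "vnet-functionapp" -ResourceGroupName "rg-myapp-ukwest-001" `
    -Location "ukwest" -AddressPrefix "10.10.0.0/16" -Subnet $funcSubnet

# Second virtual network, reachable from the internal network, that will host the Private Endpoint.
$peSubnet = New-AzVirtualNetworkSubnetConfig -Name "snet-privateendpoint" -AddressPrefix "10.20.1.0/24"
New-AzVirtualNetwork -Name "vnet-internal" -ResourceGroupName "rg-myapp-ukwest-001" `
    -Location "ukwest" -AddressPrefix "10.20.0.0/16" -Subnet $peSubnet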
Azure Blob Storage Account
The Azure Blob storage account needs Hierarchical Namespace enabled, which then gives you the ability to enable the SFTP service.
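The same settings can be applied at creation time with Az PowerShell, as sketched below. The account and container names come from the variable list later in the post, the SKU is an assumption, and the -EnableHierarchicalNamespace / -EnableSftp parameters need a reasonably recent Az.Storage module.

New-AzStorageAccount -ResourceGroupName "rg-myapp-ukwest-001" -Name "sablobftp" `
    -Location "ukwest" -SkuName "Standard_LRS" -Kind "StorageV2" `
    -EnableHierarchicalNamespace $true -EnableSftp $true

# Container that the SFTP users and the copy script will use.
New-AzRmStorageContainer -ResourceGroupName "rg-myapp-ukwest-001" -StorageAccountName "sablobftp" -Name "my-container"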
To secure the SFTP service that would be available on the Internet, under "Security + networking > Networking" the "Public network access" setting was changed to "Enabled from selected virtual networks and IP addresses".
The external IP address that our partner would be accessing the SFTP service from was added to the Firewall section to give them restricted access, and the virtual network that our Azure Function App would be using was added to the Virtual networks section.
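The same firewall change expressed in Az PowerShell; the partner IP address is a documentation placeholder and the virtual network and subnet names are the assumed ones from the earlier sketch.

# "Enabled from selected virtual networks and IP addresses" equates to a Deny default action.
Update-AzStorageAccountNetworkRuleSet -ResourceGroupName "rg-myapp-ukwest-001" -Name "sablobftp" -DefaultAction Deny

# Allow the partner's external IP address through the firewall.
Add-AzStorageAccountNetworkRule -ResourceGroupName "rg-myapp-ukwest-001" -Name "sablobftp" -IPAddressOrRange "203.0.113.10"

# Allow the Function App subnet (requires the Microsoft.Storage service endpoint on that subnet).
$vnet = Get-AzVirtualNetwork -ResourceGroupName "rg-myapp-ukwest-001" -Name "vnet-functionapp"
$subnet = Get-AzVirtualNetworkSubnetConfig -VirtualNetwork $vnet -Name "snet-functionapp"
Add-AzStorageAccountNetworkRule -ResourceGroupName "rg-myapp-ukwest-001" -Name "sablobftp" -VirtualNetworkResourceId $subnet.Id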
Azure File Service Storage Account
On the Azure Files storage account, a new share was created under "Data storage > File shares" and two sub-folders were created, "ToSUPPLIER" and "ToSYSTEM". Three Active Directory groups were created: "SystemAFSContributor", "SystemAFSElevated", and "SystemAFSRead". The SystemAFSContributor group was given the "Storage File Data SMB Share Contributor" role, the SystemAFSElevated group "Storage File Data SMB Share Elevated Contributor", and the SystemAFSRead group "Storage File Data SMB Share Reader". I also configured the NTFS permissions, granting Full Control, Modify, and Read to the relevant groups.
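A sketch of the share, folders and role assignments in Az PowerShell. It assumes the three groups are synchronised to Azure AD and scopes the role assignments to the storage account, which is one reasonable scope for them; the NTFS permissions are still set over SMB as described above.

$rg = "rg-myapp-ukwest-001"
$account = Get-AzStorageAccount -ResourceGroupName $rg -Name "safileaccount"

# Share and the two sub-folders.
New-AzRmStorageShare -ResourceGroupName $rg -StorageAccountName $account.StorageAccountName -Name "my-file-share"
New-AzStorageDirectory -ShareName "my-file-share" -Path "ToSUPPLIER" -Context $account.Context
New-AzStorageDirectory -ShareName "my-file-share" -Path "ToSYSTEM" -Context $account.Context

# SMB share roles for the three Active Directory groups, scoped to the storage account.
New-AzRoleAssignment -ObjectId (Get-AzADGroup -DisplayName "SystemAFSContributor").Id `
    -RoleDefinitionName "Storage File Data SMB Share Contributor" -Scope $account.Id
New-AzRoleAssignment -ObjectId (Get-AzADGroup -DisplayName "SystemAFSElevated").Id `
    -RoleDefinitionName "Storage File Data SMB Share Elevated Contributor" -Scope $account.Id
New-AzRoleAssignment -ObjectId (Get-AzADGroup -DisplayName "SystemAFSRead").Id `
    -RoleDefinitionName "Storage File Data SMB Share Reader" -Scope $account.Id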
To ensure that the Azure File Share is only accessible from our internal network and the virtual network for the Azure Function App, under "Security + networking > Networking" the "Public network access" setting was again changed to "Enabled from selected virtual networks and IP addresses", but this time only the Virtual networks section was modified, to include the virtual network of the Function App.
We then created a Private Endpoint connection using the second virtual network, which is accessible from our internal network. When a Private Endpoint is created, the storage account is normally given the FQDN <StorageAccountName>.file.core.windows.net, which resolves to an Internet-facing IP address for Azure. In my case I don't want this to happen, as I'm not allowing connections from the Internet to the Azure File Share; I only want the traffic to come via our internal network and our ExpressRoute connection. So, to ensure that our internal clients only connect via our ExpressRoute link to Azure, we created a DNS zone for the FQDN of the Private Endpoint, giving it the IP address we had assigned.
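Below is a sketch of the Private Endpoint side in Az PowerShell. Names are assumptions; the DNS record described above lives on the internal DNS servers, but an Azure Private DNS zone linked to the internal-facing virtual network is shown as one alternative way to get the same name resolution.

$rg = "rg-myapp-ukwest-001"
$account = Get-AzStorageAccount -ResourceGroupName $rg -Name "safileaccount"
$vnet = Get-AzVirtualNetwork -ResourceGroupName $rg -Name "vnet-internal"

# Private Endpoint for the "file" sub-resource of the storage account.
$plsConn = New-AzPrivateLinkServiceConnection -Name "plsc-safileaccount" `
    -PrivateLinkServiceId $account.Id -GroupId "file"
New-AzPrivateEndpoint -Name "pe-safileaccount" -ResourceGroupName $rg -Location "ukwest" `
    -Subnet $vnet.Subnets[0] -PrivateLinkServiceConnection $plsConn

# Optional: a Private DNS zone so the file endpoint resolves to the private IP for linked networks.
New-AzPrivateDnsZone -ResourceGroupName $rg -Name "privatelink.file.core.windows.net"
New-AzPrivateDnsVirtualNetworkLink -ResourceGroupName $rg -ZoneName "privatelink.file.core.windows.net" `
    -Name "link-vnet-internal" -VirtualNetworkId $vnet.Id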
Azure Function App Storage Account
My Azure Function App would need a storage account, and again the default setting for a storage account's firewall is "Enabled from all networks". From a security standpoint, this is not what I required, so I again changed the setting to "Enabled from selected virtual networks and IP addresses", then added the virtual network for the Function App.
Azure Function App
When creating a Function App you can choose the Runtime stack that you will use, which in my case is PowerShell Core. You also need to specify the Hosting plan that your app will use from three possible options: Consumption (Serverless), Functions Premium, and App Service plan. You can find out more information about what each of these options offers here. In my use case I needed the ability to use virtual networks, so I had to choose the Premium plan.
Once my Function App was deployed, I needed to modify a few settings. The first is under "Settings > Identity": I enabled the "System assigned" identity and assigned it the "Storage Account Contributor" role at the resource group level. This role gave the Function App the permissions needed to copy files and blobs between the storage accounts. The next setting, under "Settings > Configuration > General settings", was changing the Platform to 64 Bit. The last setting was under "Settings > Networking", where I enabled VNet integration and selected the virtual network I had created for the Function App.
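A sketch of the Function App deployment and role assignment in Az PowerShell. The plan and app names are assumptions, EP1 is the smallest Elastic Premium SKU, and the 64 Bit platform and VNet integration settings described above were done in the portal.

$rg = "rg-myapp-ukwest-001"

New-AzFunctionAppPlan -ResourceGroupName $rg -Name "plan-filecopy-ep1" -Location "ukwest" `
    -Sku EP1 -WorkerType Windows
New-AzFunctionApp -ResourceGroupName $rg -Name "func-filecopy-001" -PlanName "plan-filecopy-ep1" `
    -StorageAccountName "safuncapp" -Runtime PowerShell -IdentityType SystemAssigned

# Grant the system-assigned identity Storage Account Contributor at the resource group scope.
$app = Get-AzFunctionApp -ResourceGroupName $rg -Name "func-filecopy-001"
New-AzRoleAssignment -ObjectId $app.IdentityPrincipalId `
    -RoleDefinitionName "Storage Account Contributor" -ResourceGroupName $rg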
PowerShell Script
The PowerShell script that I have written can be obtained from my GitHub repository. It has fourteen variables that need to be configured before use, examples of which can be seen below.
$AzureSubscriptionName = "mysubscription-001"
$resourceGroup = "rg-myapp-ukwest-001"
$storageAccountBlobName = "sablobftp"
$storageAccountFileName = "safileaccount"
$storageContainerName = "my-container"
$storageFileShareName = "my-file-share"
$srcExportFolder = "ToSUPPLIER"
$srcExportFile = "SupplierExport.csv"
$dstExportFolder = "ToSUPPLIER"
$dstExportFile = "SupplierExport.csv"
$srcImportFolder = "ToSYSTEM"
$srcImportFile = "SYSTEMImport.csv"
$dstImportFolder = "ToSYSTEM"
$dstImportFile = "SYSTEMImport.csv"
When the script runs, it first looks in each of the sub-folders and tries to get the last modified date of the files "SupplierExport.csv" and "SYSTEMImport.csv".
For the SFTP area, the "ToSYSTEM" sub-folder is the source folder that the partner uploads a file to. If that file is newer than the copy in the "ToSYSTEM" folder on the File Share, the File Share copy is replaced.
For the File Share area, the "ToSUPPLIER" sub-folder is the source folder that our internal systems upload a file to. If that file is newer than the copy in the "ToSUPPLIER" folder in the SFTP container, the SFTP copy is replaced so that the partner can pick up an export from us.
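The full script is in the repository; the sketch below only illustrates the newer-file comparison and copy for the partner-to-internal direction, using the variables above. It assumes the Function App's role allows key-based storage contexts; the reverse direction works the same way with Start-AzStorageBlobCopy.

# Contexts for the two storage accounts.
$blobCtx = (Get-AzStorageAccount -ResourceGroupName $resourceGroup -Name $storageAccountBlobName).Context
$fileCtx = (Get-AzStorageAccount -ResourceGroupName $resourceGroup -Name $storageAccountFileName).Context

# Last modified dates of the partner upload (SFTP container) and the existing copy (File Share).
$srcBlob = Get-AzStorageBlob -Container $storageContainerName -Blob "$srcImportFolder/$srcImportFile" -Context $blobCtx
$dstFile = Get-AzStorageFile -ShareName $storageFileShareName -Path "$dstImportFolder/$dstImportFile" -Context $fileCtx
$dstModified = $dstFile.ShareFileClient.GetProperties().Value.LastModified

# Replace the File Share copy only when the partner has uploaded something newer.
if ($srcBlob.LastModified -gt $dstModified) {
    Start-AzStorageFileCopy -SrcContainerName $storageContainerName -SrcBlobName "$srcImportFolder/$srcImportFile" `
        -DestShareName $storageFileShareName -DestFilePath "$dstImportFolder/$dstImportFile" `
        -Context $blobCtx -DestContext $fileCtx -Force
}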
When running the script, either for testing or as part of the schedule, you will want to enable Application Insights to get logging information from each run.
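If the function is timer triggered, a minimal run.ps1 entry point might look like the sketch below (the schedule itself lives in function.json); Write-Host output from the script surfaces in the Application Insights traces.

# run.ps1 - timer-triggered entry point; $Timer is bound by function.json.
param($Timer)

Write-Host "File copy run started at $((Get-Date).ToUniversalTime())"
# ... comparison and copy logic from the script above ...
Write-Host "File copy run finished"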
Hopefully this will help others wanting to setup something similar in Azure.