
Azure Batch Service - Overview (part 1)

In the last few days, a new Azure service caught my eye. Unfortunately, only today did I have time to write about it.
Azure Batch Service is today's star. Why? Because it enables us to run tasks over Azure infrastructure without having to think about scaling, parallelism, and performance problems. You only need to write the batch logic and upload it to Azure; everything else will be managed by Azure Batch Service.
On top of this, we have scheduling and auto-scaling mechanisms that give us more granular control over compute resources. The API gives us the ability to publish and run batch applications without worrying about resources, service management, and so on. Batch Service also provides full access to the configuration, letting us manage all the resources and the task execution pipeline. In that case, you need to manage how content is moved, persisted, and so on.

What can you run?
There are two formats that can be used at this moment: a classic .exe file that represents your batch application, or a batch application represented by a collection of assemblies that implement specific interfaces (we will talk about this subject later on).

What is a TVM?
A Task Virtual Machine (TVM) is an Azure VM where your tasks will run. All the workload created by your tasks will run on one or more dedicated TVMs. The user has the ability to define a pool of TVMs that will be managed by Azure.
When you create a pool of TVMs, you can specify different attributes like VM size, OS, pool size, scaling policy, certificates, and many more. In one word: you have full control over what kind of TVMs are used for your batch tasks. And yes, each TVM has its own IP and name.
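To make the pool attributes above more concrete, here is a minimal sketch that builds a pool definition as a plain dictionary, ready to be serialized and sent to the service. The field names (`vmSize`, `targetDedicated`, `osFamily`, `enableAutoScale`) follow the general shape of the Batch REST API pool body, but the exact names and values are assumptions that may differ depending on the API version you target.

```python
import json

# Illustrative sketch only: field names mirror the general shape of the
# Batch REST API pool body; exact names and accepted values may differ
# between API versions.
def build_pool_definition(name, vm_size, target_vms, os_family="4"):
    """Build a TVM pool definition as a plain dict, ready to serialize."""
    return {
        "name": name,                   # pool identifier
        "vmSize": vm_size,              # size of each TVM, e.g. "small"
        "targetDedicated": target_vms,  # number of TVMs Azure should provision
        "osFamily": os_family,          # guest OS family for the TVMs
        "enableAutoScale": False,       # set True plus a formula for auto-scaling
    }

pool = build_pool_definition("my-batch-pool", "small", 3)
print(json.dumps(pool, indent=2))
```

The point is simply that a pool is declarative: you describe the TVMs you want, and Azure provisions and manages them for you.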

What is a work item?
A work item is the template that specifies how an application will run on TVMs. It is the interface used by developers to control how a batch application runs.

What is a job?
A job is an instance of a work item that runs on TVMs.

What is a task?
A task is the logic that is executed on a TVM. One or more tasks define a job.
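A common pattern is one task per input file, each task running the same batch executable with a different argument. The sketch below illustrates this split; the executable name, file names, and the `name`/`commandLine` fields are all hypothetical and only meant to show the job-to-task relationship.

```python
# Hypothetical sketch: split a job into one task per input file, each
# running the same batch .exe with a different argument.
def build_tasks(exe_name, input_files):
    return [
        {
            "name": f"task-{i}",                 # task identifier within the job
            "commandLine": f"{exe_name} {path}", # what the TVM will execute
        }
        for i, path in enumerate(input_files)
    ]

tasks = build_tasks("mybatch.exe", ["a.txt", "b.txt", "c.txt"])
# These three tasks together make up one job.
```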

Do I need to manage all of them (work items, jobs and tasks)?
No. If you are adding an existing application and you only want to run it over the Batch infrastructure, then the jobs, the splitting into tasks, the merging of results, and much more will be fully managed by Batch Service.
On the other hand, if you want full control over Batch, then you will need to manage and implement all of them.

What is a Merge Task?
It is the step where the results from each task are merged by the job to create the final result.
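As a toy illustration of the merge step (independent of any Batch API): imagine each worker task emits partial counts, and the merge step combines them into the final result.

```python
# Toy illustration of a merge step: each worker task produced a dict of
# partial counts; the merge combines them into the final result.
def merge_results(partial_results):
    final = {}
    for partial in partial_results:
        for key, count in partial.items():
            final[key] = final.get(key, 0) + count
    return final

merged = merge_results([{"errors": 2}, {"errors": 1, "warnings": 4}])
# → {"errors": 3, "warnings": 4}
```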

What is a file?
A file represents the input data used by jobs. Based on your logic, a task can use one or more files.

What is a directory?
A directory is the location under each task that is used to run the program and store temporary files. All files and directories under a job can be accessed by the tasks of that job.

Where do I need to store all these files?
All the files need to be stored in the storage account of the subscription that is used by Batch Service.

How do I trigger a batch action?
In general, you trigger a batch by submitting a job. As input data, the files that are related to that job will be used. This can be done very easily over the REST API.
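To give a feel for the REST call, here is a sketch that only constructs the request using the Python standard library, without sending it. The endpoint host, path, api-version, and job body fields are all illustrative assumptions; the real call also needs a shared-key or Azure AD `Authorization` header, which is omitted here.

```python
import json
import urllib.request

# Sketch only: endpoint shape, api-version, and body fields are
# illustrative and will differ by account, region, and service version.
# A real request must also carry an Authorization header.
account = "myaccount"
job_body = {"id": "nightly-run", "poolId": "my-batch-pool"}

request = urllib.request.Request(
    url=f"https://{account}.batch.core.windows.net/jobs?api-version=2014-01-01",
    data=json.dumps(job_body).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)
# urllib.request.urlopen(request) would actually submit it (credentials required).
print(request.get_method(), request.full_url)
```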

Can I monitor a job?
Yes. While a job runs, you can monitor its progress status, and you can access the output data at the end.
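Monitoring typically boils down to polling the job's state until it reaches a terminal one. The generic sketch below shows that loop; `get_state` stands in for whatever call retrieves the job status (for example, a wrapper around the "get job" REST operation), and the state names are hypothetical.

```python
import time

# Generic polling sketch: get_state is any callable returning the job's
# current state, e.g. a wrapper around the Batch "get job" REST call.
# State names here are illustrative.
def wait_for_job(get_state, poll_interval=0.0,
                 terminal_states=("completed", "failed")):
    while True:
        state = get_state()
        if state in terminal_states:
            return state
        time.sleep(poll_interval)

# Simulate a job that goes active → running → completed.
states = iter(["active", "running", "completed"])
result = wait_for_job(lambda: next(states))
# → "completed"
```

In production you would use a longer `poll_interval` (seconds, not zero) to avoid hammering the service.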

In the next post we will look over the code and discover how we can work with Azure Batch Service.
