No-code Batch Scoring
The Azure Machine Learning connector for Logic Apps enables you to schedule batch scoring jobs without having to write any code.
# Use an Azure Logic App to Schedule Running Batch Jobs #
Azure Logic Apps let you call Azure Machine Learning APIs to perform batch scoring using the Machine Learning Batch Execution Service. You can use Logic Apps to score with both New and Classic Web services.

For more information on Logic Apps, see [What are Logic Apps?](https://azure.microsoft.com/en-us/documentation/articles/app-service-logic-what-are-logic-apps/)
The overall process of creating a batch scoring job using a Logic App is:

1. Create an experiment.
2. Deploy it as a predictive Web service.
3. Create and deploy a Logic App.
4. Update the Logic App with the Batch Execution Service API URL and API key for the predictive Web service from the [Azure Machine Learning Web Services](https://services.azureml.net) portal.
5. Run the Logic App.

This article covers steps three through five.
## Create a Logic App for Batch Scoring ##
Use the *Batch Job with Input and Output* module to set up scheduled batch scoring with your machine learning model.
You can also create a Logic App to retrain the model and update your Predictive Web service. For more information, see [Use an Azure Logic App to Schedule Retraining Models]().
### Prerequisites
Before you create your Batch Scoring Logic App, you must:
* Create an experiment in Azure Machine Learning Studio.
* Deploy the experiment as a Web service.
For more information on deploying a Web service, see [Walkthrough Step 5: Deploy the Azure Machine Learning Web service](https://azure.microsoft.com/en-us/documentation/articles/machine-learning-walkthrough-5-publish-web-service/).
### Deploy the Logic App

1. Sign in to the Azure portal ([https://portal.azure.com](https://portal.azure.com)).
2. On the hub menu, click **New > Web + Mobile > Logic App**.
3. Enter a name for the Logic App.
4. Choose a subscription.
5. Choose a resource group.
6. Choose a location.
7. Click **Create**.

### Build the Batch Scoring logic app ###
On the dashboard, select your Logic App. If you don't see it, click **See more** at the bottom of **All Resources** and search for it. The Logic App Designer opens.
To complete the task, you will:

- Create a Logic App and deploy it.
- Add a *Batch Job with Input and Output* module.
- Update the job module with:
  - Your storage account information.
  - The name of the blob that contains the input data.
  - The name of the blob to receive the output.
  - The API key and the Batch Request URL from your predictive Web service.

To add modules to the Logic App:
1. In the **Logic Apps Designer**, click **Blank Logic App**.
1. In the list of triggers, click **Recurrence**, and then set the schedule on which you want the job to run.
The Logic App Designer provides the following four options for running Batch Execution Service (BES) jobs:

* Batch Job with Input and Output: The experiment has Web service input and output modules.
* Batch Job with no Input and Output: The experiment has no Web service input or output modules (for example, it uses Reader and Writer modules).
* Batch Job with only Input: The experiment has a Web service input module, but no Web service output module (for example, it uses a Writer module).
* Batch Job with only Output: The experiment has no Web service input module, but has a Web service output module (for example, it uses a Reader module).

**Note**: A BES job is an asynchronous request and can take time to complete, depending on the size of your data and the complexity of the model. When the job completes, the connector returns the output result.
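Because BES jobs are asynchronous, a caller generally submits the job and then polls until it reaches a terminal state; the connector handles this for you, but the pattern is worth seeing. Below is a minimal, generic sketch of such a polling loop; the `get_status` callable and the status strings are illustrative placeholders, not part of the connector's API.

```python
import time

def wait_for_job(get_status, poll_interval=5.0, timeout=3600.0):
    """Poll a batch job until it reaches a terminal state.

    get_status: a callable returning one of "NotStarted", "Running",
    "Finished", or "Failed" (illustrative status strings).
    Returns the final status, or raises TimeoutError.
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        status = get_status()
        if status in ("Finished", "Failed"):
            return status
        time.sleep(poll_interval)
    raise TimeoutError("batch job did not finish in time")

# Usage with a fake status source that finishes on the third poll:
responses = iter(["NotStarted", "Running", "Finished"])
print(wait_for_job(lambda: next(responses), poll_interval=0.01))  # Finished
```

Injecting `get_status` keeps the loop testable without any network access; a real caller would supply a function that queries the job-status endpoint of the web service.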
The following example assumes an API with input and output.
1. Click **New Step**, then click **Add an action**.
1. In Search, type *Azure ML*.
1. Select **Azure ML - Batch Job with Input and Output**.

Since the Studio experiment has Web service input and output modules, you need to provide the container name, storage account name, and storage account key for the input and output files.
In the Azure portal, under **Services**, select **Storage accounts**. If it is not visible, click **More services** and search for it.
Select or create a storage account:
- Enter the name of the storage account in **Storage Account Name (Input)** and **Storage Account Name (Output)**.
- Enter the key in **Storage Account Key (Input)** and **Storage Account Key (Output)**.
- Enter the container name in the **Storage Container Name (Input)** and **Storage Container Name (Output)** fields.
**Note**: You can use different storage accounts and containers for the input and output. You can also choose not to specify the storage information for the output; in that case, the storage account and container specified for the input are used, along with a default name for the output file.
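Behind the scenes, this storage information is passed to the Batch Execution Service as blob references in the request body. The sketch below shows what an input/output pair can look like when assembled; the account name, key, container names, and blob names are all placeholders you would substitute with your own values.

```python
# Placeholder storage values -- substitute your own account, key, and blobs.
ACCOUNT = "mystorageaccount"
KEY = "storage-account-key"

def blob_reference(container, blob_name):
    """Build a blob data reference of the kind used in a BES request body."""
    return {
        "ConnectionString": (
            "DefaultEndpointsProtocol=https;"
            f"AccountName={ACCOUNT};AccountKey={KEY}"
        ),
        "RelativeLocation": f"{container}/{blob_name}",
    }

request_body = {
    "Inputs": {"input1": blob_reference("input-container", "scoring-input.csv")},
    "Outputs": {"output1": blob_reference("output-container", "scoring-output.csv")},
    "GlobalParameters": {},
}
```

Note how the input and output references are independent, which is why the connector lets you use different storage accounts and containers for each.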
Next, you need the Web Service Batch Execution Service API URL and API Key.
1. Sign in to the [Azure Machine Learning Web Services](https://services.azureml.net) portal.
2. Click **Classic Web Services**.
3. Click your Web service.
4. Click the default endpoint.
5. Click **Consume**.
6. From the Consume page, copy the **Primary Key** to the **API Key** field in the Logic App, and copy the **Batch Requests** URL to the **API POST URL** field.
7. Enter the name of the blob that contains the input data in **Blob Name (Input)**.
8. Enter the name of the blob to receive the output in **Blob Name (Output)**.
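For reference, the same API key and Batch Requests URL could also be used to submit a BES job directly over HTTP, which is essentially what the connector does on your behalf. The sketch below only assembles the pieces of such a request without sending it; the URL and key values are placeholders, and the exact job-submission endpoints are documented on your web service's Consume page.

```python
# Placeholders -- use the Batch Requests URL and Primary Key from the portal.
API_POST_URL = "https://services.azureml.net/workspaces/<workspace>/services/<service>/jobs"
API_KEY = "your-api-key"

def build_submit_request(api_url, api_key, request_body):
    """Assemble the parts of a BES job-submission HTTP request."""
    return {
        "method": "POST",
        "url": f"{api_url}?api-version=2.0",
        "headers": {
            # The API key is sent as a bearer token.
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        "body": request_body,
    }

req = build_submit_request(
    API_POST_URL,
    API_KEY,
    {"Inputs": {}, "Outputs": {}, "GlobalParameters": {}},
)
```

A real caller would pass these parts to an HTTP client and then poll the returned job until it completes.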

The completed module should look similar to the following:

In addition, you can include Global (Web service) Parameters if your experiment includes them. For more information on Web service parameters, see [Use Azure Machine Learning Web Service Parameters](machine-learning-web-service-parameters.md).
To add Web service parameters, click **Show Advanced Options**.

In the Global Parameter fields, enter one or more global (Web service) parameters as a comma-separated list of fields and values.
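These global parameters travel in the `GlobalParameters` section of the BES request body, keyed by the parameter names your experiment defines. The example below is hypothetical: `Database query` stands in for whatever parameter name your own experiment exposes.

```python
request_body = {
    "Inputs": {},
    "Outputs": {},
    # Keys are the Web service parameter names defined by your experiment;
    # "Database query" is a hypothetical example name and value.
    "GlobalParameters": {"Database query": "SELECT * FROM scoring_input"},
}
```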

Click **Save**.
The job will start running on the schedule you specified.
You can see the status of your job by viewing the dashboard for the Logic App.

To view the results, open the storage account and container you specified and download the output file.
You can test the Logic App by opening it and clicking **Run Trigger**.

Other variations on BES jobs, such as a job with no web service input or output, are also available.
## Summary ##
Using Azure Logic Apps, you can run batch scoring jobs that are executed on demand or on a recurring schedule without the need to write any code.