HARMAN ANALYTICS : Promotion Effectiveness
This collection of experiments demonstrates a model for estimating the sales lift due to promotions in retail stores across various categories.
**Promotion Effectiveness Template**
Sales promotion is frequently used in grocery retailing and takes up a large share of the resources
grocery retailers spend on marketing. The heavy usage of sales promotion has recently been questioned by retailers and manufacturers: despite the large sums spent on sales promotion every year, its economics are poorly understood. This Sales Lift Estimation Template focuses on estimating the lift attained and provides a method to predict the sales lift for a future promotion. As part of the Azure Machine Learning offering, Harman provides this template to category managers to expedite their sales lift calculations. The template provides pre-configured machine learning modules along with custom R scripts in the Execute R Script module.
The overall template is divided into four major segments, each covering one of the following steps:
![Steps][1]
**Here are the links to each step (experiment) of the template:**
**1. Promotion Effectiveness Template: Step 1 of 4, Data Import**
**2. Promotion Effectiveness Template: Step 2 of 4, Baseline Estimation**
**3. Promotion Effectiveness Template: Step 3 of 4, Sales Lift Estimation**
**4. Promotion Effectiveness Template: Step 4 of 4, Lift Prediction**
**Step 1: Data Import**
Details of this experiment can be seen in the following process flow:
![Process flow][2]
The Promotion Effectiveness template takes a single input file, RawInputdata.csv.
The schema of the input data is shown in the following table:
![Datafields][3]
Data Input
1.0 Input data: The Reader module reads the input data from the file RawInputdata.csv. To inspect the data, right-click the input data node and select Visualize.
![Input data][4]
Connect a Project Columns node to the input data to select the relevant columns
![Project column][5]
Add a Descriptive Statistics node to the input data to understand the data distribution
![Descriptive statistics][6]
1.1 Drag an Execute R Script node to calculate wk_mean_value by aggregating value at the store, brand & week level
1.2 Calculate total_mean_value by aggregating value at the store & brand level
![TotalmeanValue][7]
1.3 Calculate the seasonal index using the formula
Seasonal Index = wk_mean_value / total_mean_value
1.4 After mapping the seasonal index back onto the input data, calculate the deseasonalised value using the formula
Deseasonalised Value = value / Seasonal Index
![Deseasonalizedvalue][8]
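The template implements these calculations in R inside the Execute R Script module. As a rough illustration, the same seasonal-index and deseasonalisation logic can be sketched in Python with pandas; the store/brand/wk/value column names follow the input schema above, and the numbers are made up:

```python
import pandas as pd

# Toy data: one store-brand pair observed in the same two weeks across two years.
df = pd.DataFrame({
    "store": [1, 1, 1, 1],
    "brand": ["A", "A", "A", "A"],
    "wk":    [1, 2, 1, 2],
    "value": [100.0, 140.0, 110.0, 150.0],
})

# 1.1 wk_mean_value: mean of value at the store, brand & week level
df["wk_mean_value"] = df.groupby(["store", "brand", "wk"])["value"].transform("mean")

# 1.2 total_mean_value: mean of value at the store & brand level
df["total_mean_value"] = df.groupby(["store", "brand"])["value"].transform("mean")

# 1.3 Seasonal Index = wk_mean_value / total_mean_value
df["seasonal_index"] = df["wk_mean_value"] / df["total_mean_value"]

# 1.4 Deseasonalised Value = value / Seasonal Index
df["deseason_value"] = df["value"] / df["seasonal_index"]
```

With these toy numbers, week 1 averages 105 and week 2 averages 145 against an overall mean of 125, so the seasonal indices are 0.84 and 1.16 and the deseasonalised values are pulled back toward the overall mean.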
**Step 2: Baseline Estimation**
2.0 Drag a new Execute R Script node to create a promotion flag, max_promo: 1 for weeks in which a promotion has taken place, 0 otherwise
2.1 Create two lag variables, for store and brand, and another unique flag for each store & brand combination to build a sequence for calculating the lag of the deseasonalised value
![Two lags][9]
2.2 Calculate the lagged deseasonalised value for each store and brand combination
2.3 Create four independent data frames by calculating the baseline value under four conditions
2.3.0 If max_promo is 0 & seq = 1, the baseline is defined as
(2/3) deseasonalised value + (1/3) lagged deseasonalised value
2.3.1 If max_promo is 0 & seq > 1, the baseline is flagged as 9999
2.3.2 If max_promo is 1 & seq = 1, the baseline is defined as the
deseasonalised value
2.3.3 If max_promo is 1 & seq > 1, the baseline is defined as the
lagged deseasonalised value
2.4 Append all four tables and order by store, BF & week
2.5 If max_promo is 0, keep the incremental value at 0; otherwise the incremental value is value minus baseline
2.6 If any incremental value is less than 0, replace it with 0.
2.7 The baseline output is shown in the snapshot below:
![Baseline value][10]
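The four baseline conditions and the incremental-value rules above can be sketched as follows. This is an illustrative Python/pandas translation of the template's R logic; the toy rows, and the reading of seq as a row's position within its store-brand sequence, are assumptions:

```python
import numpy as np
import pandas as pd

# Toy rows for one store-brand sequence; lag_deseason is the lag of the
# deseasonalised value within the sequence (undefined for the first row).
df = pd.DataFrame({
    "value":        [100.0, 160.0, 120.0, 150.0],
    "deseason":     [105.0, 155.0, 118.0, 148.0],
    "lag_deseason": [np.nan, 105.0, 155.0, 118.0],
    "max_promo":    [1, 0, 0, 1],
    "seq":          [1, 2, 3, 4],
})

# 2.3 The four baseline conditions (2.3.0 through 2.3.3)
conds = [
    (df["max_promo"] == 0) & (df["seq"] == 1),
    (df["max_promo"] == 0) & (df["seq"] > 1),
    (df["max_promo"] == 1) & (df["seq"] == 1),
    (df["max_promo"] == 1) & (df["seq"] > 1),
]
choices = [
    (2 / 3) * df["deseason"] + (1 / 3) * df["lag_deseason"],  # 2.3.0
    9999.0,                 # 2.3.1: flag value; ignored later since max_promo is 0
    df["deseason"],         # 2.3.2
    df["lag_deseason"],     # 2.3.3
]
df["baseline"] = np.select(conds, choices)

# 2.5 Incremental value: 0 when there is no promotion, else value minus baseline
df["incremental"] = np.where(df["max_promo"] == 0, 0.0,
                             df["value"] - df["baseline"])

# 2.6 Floor negative incremental values at 0
df["incremental"] = df["incremental"].clip(lower=0.0)
```

Note how the 9999 flag in condition 2.3.1 never leaks into the results: those rows have max_promo = 0, so rule 2.5 forces their incremental value to 0 regardless of the baseline.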
**Step 3: Sales Lift Estimation**
3.0.0 Drag a new Execute R Script node to melt the data in order to visualize the sales lift by brand and category across stores.
Use Store, Category & Brand as ID variables and LIFT as the measure variable
3.0.1 Finally, calculate the sales lift using the formula
Sales Lift = Incremental Value / (Value + Incremental Value)
![Sales lift est][11]
3.0.2 Filter to lift > 10 percent for use in the visualization chart
3.0.3 Using the ggplot2 package, loop over all stores to visualize sales LIFT by category & brand and by brand & promotion
![Lift output][12]
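A minimal sketch of the lift formula, the 10-percent filter, and the melt step above. The template does this in R; here the same logic is shown with pandas, and the store/category/brand values are invented for the example:

```python
import pandas as pd

# Toy per-store-brand output from Step 2 (value and incremental value).
df = pd.DataFrame({
    "Store":       [1, 1, 2],
    "Category":    ["Snacks", "Snacks", "Drinks"],
    "Brand":       ["A", "B", "C"],
    "value":       [100.0, 200.0, 150.0],
    "incremental": [25.0, 10.0, 50.0],
})

# 3.0.1 Sales Lift = Incremental Value / (Value + Incremental Value)
df["LIFT"] = df["incremental"] / (df["value"] + df["incremental"])

# 3.0.2 Keep only lift above 10 percent for the visualization
viz = df[df["LIFT"] > 0.10]

# 3.0.0 Melt with Store, Category & Brand as IDs and LIFT as the measure
melted = viz.melt(id_vars=["Store", "Category", "Brand"],
                  value_vars=["LIFT"], var_name="measure", value_name="lift")
```

Brand B's lift (10 / 210 ≈ 4.8%) falls below the 10-percent threshold, so only two rows survive into the long-format table that feeds the charts.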
3.1.0 Save the promotional lift output using a Writer node for further analysis
3.1.1 Insert the Writer node into the Azure ML process flow & connect it to the preceding node
3.1.2 In the Writer node properties, fill in the blob storage path, account name and account key
![writer node][13]
**Step 4: Lift Prediction**
4.0.0 Insert a Project Columns node to select the relevant columns for demand lift prediction
![Newcolumn][14]
4.0.1 Insert a Metadata Editor node to rename variables
![Newmetadata][15]
4.0.2 Insert a Split node to partition the data into training & test sets
![Data Partition][16]
4.1.0 Insert a Linear Regression node to predict the demand lift
4.1.1 Insert a Train Model node to specify the target variable as Lift
![Reset Target variable][17]
4.1.2 Connect the Train Model output & the Split output to a Score Model node to score the predicted lift
4.1.3 Insert an Evaluate Model node to check model performance
![Model performance][18]
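The Split → Linear Regression → Train Model → Score Model → Evaluate Model chain can be approximated outside Azure ML as an ordinary least-squares fit. This sketch uses made-up features and a noiseless toy target, with RMSE standing in for the Evaluate Model metrics:

```python
import numpy as np

# Made-up feature matrix (e.g. promo depth and seasonal index) and a
# noiseless linear toy target for the lift.
rng = np.random.default_rng(0)
X = rng.random((100, 2))
lift = 0.3 * X[:, 0] + 0.1 * X[:, 1] + 0.05

# Split node: partition into train & test sets (70/30)
X_train, X_test = X[:70], X[70:]
y_train, y_test = lift[:70], lift[70:]

# Linear Regression + Train Model nodes: least squares with an intercept column
A = np.column_stack([X_train, np.ones(len(X_train))])
coef, *_ = np.linalg.lstsq(A, y_train, rcond=None)

# Score Model node: predict lift on the held-out test set
pred = np.column_stack([X_test, np.ones(len(X_test))]) @ coef

# Evaluate Model node: root mean squared error on the test set
rmse = float(np.sqrt(np.mean((pred - y_test) ** 2)))
```

Because the toy target is exactly linear, the fit recovers the coefficients (0.3, 0.1) and intercept (0.05) and the test RMSE is essentially zero; on real promotion data the Evaluate Model metrics are what tell you whether the lift model generalizes.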
4.1.4 Connect a web service input & web service output for web deployment
![web service][19]
4.1.5. Once the experiment runs successfully, it is ready for web deployment
![Web deployment][20]
4.1.6 After clicking Deploy Web Service, you will be asked to confirm overwriting the output. Click OK to proceed
4.1.7 Click TEST to enter data for the API
![Test][21]
4.1.8 Enter data and run
![Enter data][22]
4.1.9 The predicted demand lift output will be generated
4.2.0 Power BI is connected to Azure Blob Storage, where the output was saved. The lift is visualized by promotion & brand across different stores in Power BI
![power BI][23]
[1]: https://dl.dropboxusercontent.com/s/vmh7cokboct3ygl/Steps1.jpg?dl=0
[2]: https://dl.dropboxusercontent.com/s/fmsa58fm2su018t/Process%20flownew.jpg?dl=0
[3]: https://dl.dropboxusercontent.com/s/vqzzm6c8w4jbmyu/DataFields.jpg?dl=0
[4]: https://dl.dropboxusercontent.com/s/1q49afr70n119ww/Inputdata.jpg?dl=0
[5]: https://dl.dropboxusercontent.com/s/y7i0zbusyusjpxh/Coulmns%20selected1.jpg?dl=0
[6]: https://dl.dropboxusercontent.com/s/ir7ib9bcsno2eze/Descriptive%20statistcs.jpg?dl=0
[7]: https://dl.dropboxusercontent.com/s/zhbxv0j893i997o/Totalmeanvalue.jpg?dl=0
[8]: https://dl.dropboxusercontent.com/s/u0295mbnrb6n94q/deseasonalisedvalue.jpg?dl=0
[9]: https://dl.dropboxusercontent.com/s/s1z136ubpm7l7py/twolags.jpg?dl=0
[10]: https://dl.dropboxusercontent.com/s/tot8dsy8ezv677m/Baseline%20value.jpg?dl=0
[11]: https://dl.dropboxusercontent.com/s/ynp9wb0n6zhgpnv/SalesLiftEstimation.jpg?dl=0
[12]: https://dl.dropboxusercontent.com/s/b3nqmv7vda8qp73/LiftVisualization.jpg?dl=0
[13]: https://dl.dropboxusercontent.com/s/cvkqz6evgzun279/Writernode.jpg?dl=0
[14]: https://dl.dropboxusercontent.com/s/1kw12od0r7limyf/Newselectcolumn2.jpg?dl=0
[15]: https://dl.dropboxusercontent.com/s/ohcvx8zom69zdv8/NewMetadata.jpg?dl=0
[16]: https://dl.dropboxusercontent.com/s/feel6tw0lax32nc/Split.jpg?dl=0
[17]: https://dl.dropboxusercontent.com/s/ntvbxwc8k7tz7nv/Trainnode.jpg?dl=0
[18]: https://dl.dropboxusercontent.com/s/pvgdjdbny4zo3kp/Evaluatenode.jpg?dl=0
[19]: https://dl.dropboxusercontent.com/s/31l388mjwikobni/Webservices.jpg?dl=0
[20]: https://dl.dropboxusercontent.com/s/qp7y9rure8d2pm8/deploy.jpg?dl=0
[21]: https://dl.dropboxusercontent.com/s/96kr98ypgr1b26o/Test.jpg?dl=0
[22]: https://dl.dropboxusercontent.com/s/b32nkvjzr4fchw6/Enterdata.jpg?dl=0
[23]: https://dl.dropboxusercontent.com/s/sxn1thqsjtwxv13/PBI.jpg?dl=0