Ecometry® Business Intelligence and Reporting Analysis Part 2

by Brennan Mann 23. October 2010 04:10

The next step towards Business Intelligence is populating your data warehouse. If you currently have a data warehouse, you are probably using complicated DDL scripts, stored procedures, dangerous database triggers, and collection tables that pollute your database. If you are relying on database triggers, you are in bad shape: a trigger that fails to fire can silently lose data.

If you are using triggers, make sure you have error handling in place.

1.)    Use a TRY...CATCH block in your trigger.

BEGIN TRY
     { sql_statement | statement_block }
END TRY
BEGIN CATCH
          [ { sql_statement | statement_block } ]
END CATCH
[ ; ]
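For example, here is a minimal sketch of a trigger that logs its own failures instead of silently swallowing them. The trigger and the ORDER_AUDIT and TRIGGER_ERRORS tables are hypothetical, not part of the Ecometry schema:

CREATE TRIGGER trgOrderAudit ON dbo.ORDERHEADER
AFTER INSERT
AS
BEGIN
    SET NOCOUNT ON;
    BEGIN TRY
        -- Copy the new order keys into a hypothetical audit table
        INSERT INTO dbo.ORDER_AUDIT (FULLORDERNO, AUDITDATE)
        SELECT FULLORDERNO, GETDATE() FROM inserted;
    END TRY
    BEGIN CATCH
        -- Only attempt to log if the transaction is still usable
        IF XACT_STATE() <> -1
            INSERT INTO dbo.TRIGGER_ERRORS (ERRORMSG, ERRORDATE)
            VALUES (ERROR_MESSAGE(), GETDATE());
    END CATCH
END;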


2.)    Be aware of your transaction scope. In MS SQL, you can check @@TRANCOUNT, which returns the number of BEGIN TRANSACTION statements that have occurred on the current connection.

The BEGIN TRANSACTION statement increments @@TRANCOUNT by 1. ROLLBACK TRANSACTION decrements @@TRANCOUNT to 0, except for ROLLBACK TRANSACTION savepoint_name, which does not affect @@TRANCOUNT. COMMIT TRANSACTION or COMMIT WORK decrement @@TRANCOUNT by 1.
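A quick sketch of that behavior:

BEGIN TRANSACTION;      -- @@TRANCOUNT is now 1
BEGIN TRANSACTION;      -- @@TRANCOUNT is now 2
COMMIT TRANSACTION;     -- @@TRANCOUNT is back to 1
ROLLBACK TRANSACTION;   -- rolls back everything
SELECT @@TRANCOUNT;     -- returns 0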

3.)    Don't use triggers.


“Making triggers is one of the more invasive things you can do in a system. It is part of the database schema for enforcing database behavior and you are doing this in someone else's database. They have their place, but I don't think this usage is appropriate IF you can't ensure that they behave well. Frankly if you are using a trigger as your only mechanism to ship changes, if the trigger was to fail silently, your changes wouldn't get shipped, yet the transaction could complete and you would lose the change data "message", how would you re-sync your data reliably anyway?”


Instead of using database triggers, we will be using SQL Replication to move data from your Ecometry database server to your reporting database server. With the introduction of the MERGE command in SQL Server 2008, developers can handle common data warehousing scenarios more effectively, such as checking whether a row exists and then executing an insert, update, or delete. The MERGE statement merges data from a source result set into a target table based on a condition that you specify and on whether the source data already exists in the target. The new command combines the sequence of conditional INSERT, UPDATE, and DELETE commands into a single atomic statement.
One of the most important advantages of the MERGE statement is that the data is read and processed only once. In earlier versions, three different statements had to be written to perform the three different activities (INSERT, UPDATE, or DELETE); with MERGE, all of the update activities can be done in one pass of the database table.


What we will be doing is normalizing our data and storing it in a "FACT" table for reporting.  Remember from my initial post: "A fact table contains the basic information that you wish to summarize. This might be order detail information, payroll records or anything else that's amenable to summing and averaging. Any table that you've used with a Sum or Avg function in a totals query is a good bet to be a fact table."  We will create a number of FACT tables for different areas of business analysis.  Below is an example of normalizing your Ecometry data using the SQL MERGE command:

use DataWarehouse
-- Line Item 001
MERGE dbo.FACTTABLE AS targetTable
USING ( SELECT DISTINCT dbo.ORDERSUBHEAD.FULLORDERNO,
               dbo.ORDERSUBHEAD.COMPANY,
               dbo.ORDERSUBHEAD.DIVISION,
               dbo.ORDERSUBHEAD.SHIPMETHOD,
               dbo.ORDERSUBHEAD.EDPNOS_001,
               dbo.ORDERSUBHEAD.LINENOS_001,
               dbo.ORDERSUBHEAD.ITEMQTYS_001,
               dbo.ORDERSUBHEAD.EXTPRICES_001,
               dbo.ORDERSUBHEAD.EXTCOSTS_001,
               dbo.ORDERSUBHEAD.BIGSTATUS,
               dbo.ORDERHEADER.CUSTEDP,
               dbo.ORDERHEADER.SOURCE,
               dbo.ORDERHEADER.PAYMETHOD,
               dbo.ORDERHEADER.MAILDATE,
               dbo.ORDERHEADER.ENTRYDATE            
               FROM dbo.ORDERSUBHEAD
               JOIN dbo.ORDERHEADER
               ON dbo.ORDERSUBHEAD.ORDERNO + '0000' = dbo.ORDERHEADER.FULLORDERNO ) AS sourceTable
ON   targetTable.FULLORDERNO = sourceTable.FULLORDERNO
    AND targetTable.LINENUMBER = sourceTable.LINENOS_001
WHEN NOT MATCHED THEN INSERT
    ( CUSTEDP, FULLORDERNO, COMPANY, DIVISION, SOURCE, OFFER, USECODE, MAILDATE, ENTRYDATE, ENTRYTIME, PAYMETHOD, SHIPMETHOD,
    ITEMEDPNO, LINENUMBER, QTY, EXTPRICE, EXTCOST, STATUS )
    VALUES
    (
        sourceTable.CUSTEDP,
        sourceTable.FULLORDERNO,
        sourceTable.COMPANY,
        sourceTable.DIVISION,
        sourceTable.SOURCE,
        SUBSTRING(sourceTable.SOURCE,1,8),
        SUBSTRING(sourceTable.SOURCE,9,8),
        CONVERT(datetime, convert(char(8), sourceTable.MAILDATE)  ),
        CONVERT(datetime, convert(char(8), sourceTable.ENTRYDATE) ),
        CONVERT(datetime, convert(char(8), sourceTable.ENTRYDATE) ),     
        sourceTable.PAYMETHOD,
        sourceTable.SHIPMETHOD,
        sourceTable.EDPNOS_001,
        sourceTable.LINENOS_001,
        sourceTable.ITEMQTYS_001,
        convert(money, sourceTable.EXTPRICES_001) / 100,
        convert(money, sourceTable.EXTCOSTS_001) / 100,
        SUBSTRING(sourceTable.BIGSTATUS,2,1)
     )
WHEN MATCHED THEN UPDATE SET
      targetTable.CUSTEDP      = sourceTable.CUSTEDP,
      targetTable.FULLORDERNO  = sourceTable.FULLORDERNO,
      targetTable.SOURCE       = sourceTable.SOURCE,
      targetTable.PAYMETHOD    = sourceTable.PAYMETHOD,
      targetTable.ITEMEDPNO    = sourceTable.EDPNOS_001,
      targetTable.LINENUMBER   = sourceTable.LINENOS_001,
      targetTable.QTY          = sourceTable.ITEMQTYS_001,
      targetTable.STATUS       = SUBSTRING(sourceTable.BIGSTATUS,2,1),
      targetTable.EXTPRICE     = convert(money, sourceTable.EXTPRICES_001) / 100,
      targetTable.EXTCOST      = convert(money, sourceTable.EXTCOSTS_001) / 100
;


-- Line Item 002

MERGE dbo.FACTTABLE AS targetTable
USING ( SELECT DISTINCT dbo.ORDERSUBHEAD.FULLORDERNO,
               dbo.ORDERSUBHEAD.COMPANY,
               dbo.ORDERSUBHEAD.DIVISION,
               dbo.ORDERSUBHEAD.SHIPMETHOD,
               dbo.ORDERSUBHEAD.EDPNOS_002,
               dbo.ORDERSUBHEAD.LINENOS_002,
               dbo.ORDERSUBHEAD.ITEMQTYS_002,
               dbo.ORDERSUBHEAD.EXTPRICES_002,
               dbo.ORDERSUBHEAD.EXTCOSTS_002,
               dbo.ORDERSUBHEAD.BIGSTATUS,
               dbo.ORDERHEADER.CUSTEDP,
               dbo.ORDERHEADER.SOURCE,
               dbo.ORDERHEADER.PAYMETHOD,
               dbo.ORDERHEADER.MAILDATE,
               dbo.ORDERHEADER.ENTRYDATE            
               FROM dbo.ORDERSUBHEAD
               JOIN dbo.ORDERHEADER
               ON dbo.ORDERSUBHEAD.ORDERNO + '0000' = dbo.ORDERHEADER.FULLORDERNO ) AS sourceTable
ON   targetTable.FULLORDERNO = sourceTable.FULLORDERNO
    AND targetTable.LINENUMBER = sourceTable.LINENOS_002
WHEN NOT MATCHED AND sourceTable.EDPNOS_002 > 0 THEN INSERT
    ( CUSTEDP, FULLORDERNO, COMPANY, DIVISION, SOURCE, OFFER, USECODE, MAILDATE, ENTRYDATE, ENTRYTIME, PAYMETHOD, SHIPMETHOD,
    ITEMEDPNO, LINENUMBER, QTY, EXTPRICE, EXTCOST, STATUS )
    VALUES
    (
        sourceTable.CUSTEDP,
        sourceTable.FULLORDERNO,
        sourceTable.COMPANY,
        sourceTable.DIVISION,
        sourceTable.SOURCE,
        SUBSTRING(sourceTable.SOURCE,1,8),
        SUBSTRING(sourceTable.SOURCE,9,8),
        CONVERT(datetime, convert(char(8), sourceTable.MAILDATE)  ),
        CONVERT(datetime, convert(char(8), sourceTable.ENTRYDATE) ),
        CONVERT(datetime, convert(char(8), sourceTable.ENTRYDATE) ),     
        sourceTable.PAYMETHOD,
        sourceTable.SHIPMETHOD,
        sourceTable.EDPNOS_002,
        sourceTable.LINENOS_002,
        sourceTable.ITEMQTYS_002,
        convert(money, sourceTable.EXTPRICES_002) / 100,
        convert(money, sourceTable.EXTCOSTS_002) / 100,
        SUBSTRING(sourceTable.BIGSTATUS,3,1)
     )
WHEN MATCHED THEN UPDATE SET
      targetTable.CUSTEDP      = sourceTable.CUSTEDP,
      targetTable.FULLORDERNO  = sourceTable.FULLORDERNO,
      targetTable.SOURCE       = sourceTable.SOURCE,
      targetTable.PAYMETHOD    = sourceTable.PAYMETHOD,
      targetTable.ITEMEDPNO    = sourceTable.EDPNOS_002,
      targetTable.LINENUMBER   = sourceTable.LINENOS_002,
      targetTable.QTY          = sourceTable.ITEMQTYS_002,
      targetTable.STATUS       = SUBSTRING(sourceTable.BIGSTATUS,3,1),
      targetTable.EXTPRICE     = convert(money, sourceTable.EXTPRICES_002) / 100,
      targetTable.EXTCOST      = convert(money, sourceTable.EXTCOSTS_002) / 100
;


-- Line Item 003

MERGE dbo.FACTTABLE AS targetTable
USING ( SELECT DISTINCT dbo.ORDERSUBHEAD.FULLORDERNO,
               dbo.ORDERSUBHEAD.COMPANY,
               dbo.ORDERSUBHEAD.DIVISION,
               dbo.ORDERSUBHEAD.SHIPMETHOD,
               dbo.ORDERSUBHEAD.EDPNOS_003,
               dbo.ORDERSUBHEAD.LINENOS_003,
               dbo.ORDERSUBHEAD.ITEMQTYS_003,
               dbo.ORDERSUBHEAD.EXTPRICES_003,
               dbo.ORDERSUBHEAD.EXTCOSTS_003,
               dbo.ORDERSUBHEAD.BIGSTATUS,
               dbo.ORDERHEADER.CUSTEDP,
               dbo.ORDERHEADER.SOURCE,
               dbo.ORDERHEADER.PAYMETHOD,
               dbo.ORDERHEADER.MAILDATE,
               dbo.ORDERHEADER.ENTRYDATE            
               FROM dbo.ORDERSUBHEAD
               JOIN dbo.ORDERHEADER
               ON dbo.ORDERSUBHEAD.ORDERNO + '0000' = dbo.ORDERHEADER.FULLORDERNO ) AS sourceTable
ON   targetTable.FULLORDERNO = sourceTable.FULLORDERNO
    AND targetTable.LINENUMBER = sourceTable.LINENOS_003
WHEN NOT MATCHED AND sourceTable.EDPNOS_003 > 0 THEN INSERT
    ( CUSTEDP, FULLORDERNO, COMPANY, DIVISION, SOURCE, OFFER, USECODE, MAILDATE, ENTRYDATE, ENTRYTIME, PAYMETHOD, SHIPMETHOD,
    ITEMEDPNO, LINENUMBER, QTY, EXTPRICE, EXTCOST, STATUS )
    VALUES
    (
        sourceTable.CUSTEDP,
        sourceTable.FULLORDERNO,
        sourceTable.COMPANY,
        sourceTable.DIVISION,
        sourceTable.SOURCE,
        SUBSTRING(sourceTable.SOURCE,1,8),
        SUBSTRING(sourceTable.SOURCE,9,8),
        CONVERT(datetime, convert(char(8), sourceTable.MAILDATE)  ),
        CONVERT(datetime, convert(char(8), sourceTable.ENTRYDATE) ),
        CONVERT(datetime, convert(char(8), sourceTable.ENTRYDATE) ),     
        sourceTable.PAYMETHOD,
        sourceTable.SHIPMETHOD,
        sourceTable.EDPNOS_003,
        sourceTable.LINENOS_003,
        sourceTable.ITEMQTYS_003,
        convert(money, sourceTable.EXTPRICES_003) / 100,
        convert(money, sourceTable.EXTCOSTS_003) / 100,
        SUBSTRING(sourceTable.BIGSTATUS,4,1)
     )
WHEN MATCHED THEN UPDATE SET
      targetTable.CUSTEDP      = sourceTable.CUSTEDP,
      targetTable.FULLORDERNO  = sourceTable.FULLORDERNO,
      targetTable.SOURCE       = sourceTable.SOURCE,
      targetTable.PAYMETHOD    = sourceTable.PAYMETHOD,
      targetTable.ITEMEDPNO    = sourceTable.EDPNOS_003,
      targetTable.LINENUMBER   = sourceTable.LINENOS_003,
      targetTable.QTY          = sourceTable.ITEMQTYS_003,
      targetTable.STATUS       = SUBSTRING(sourceTable.BIGSTATUS,4,1),
      targetTable.EXTPRICE     = convert(money, sourceTable.EXTPRICES_003) / 100,
      targetTable.EXTCOST      = convert(money, sourceTable.EXTCOSTS_003) / 100
;
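The three statements above differ only in their _001/_002/_003 suffixes, which come from Ecometry's flattened repeating line-item columns. As a sketch (my own, not part of the original scripts), the repeating columns can be unpivoted with CROSS APPLY so that a single MERGE covers every line slot; the alias names are mine:

SELECT  sub.FULLORDERNO,
        v.LINENUMBER, v.ITEMEDPNO, v.QTY, v.EXTPRICE, v.EXTCOST
FROM    dbo.ORDERSUBHEAD AS sub
CROSS APPLY ( VALUES
    (sub.LINENOS_001, sub.EDPNOS_001, sub.ITEMQTYS_001, sub.EXTPRICES_001, sub.EXTCOSTS_001),
    (sub.LINENOS_002, sub.EDPNOS_002, sub.ITEMQTYS_002, sub.EXTPRICES_002, sub.EXTCOSTS_002),
    (sub.LINENOS_003, sub.EDPNOS_003, sub.ITEMQTYS_003, sub.EXTPRICES_003, sub.EXTCOSTS_003)
) AS v (LINENUMBER, ITEMEDPNO, QTY, EXTPRICE, EXTCOST)
WHERE   v.ITEMEDPNO > 0

Feeding a result set like this into the USING clause would replace the three separate MERGE statements with one.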

 Below is a sample sales FACTTABLE. Notice that our primary key is the Ecometry FULLORDERNO plus LINENUMBER:

 

USE [DataWarehouse]
GO

/****** Object:  Table [dbo].[FACTTABLE]    Script Date: 10/22/2010 14:43:33 ******/
SET ANSI_NULLS ON
GO

SET QUOTED_IDENTIFIER ON
GO

SET ANSI_PADDING ON
GO

CREATE TABLE [dbo].[FACTTABLE](
    [CUSTEDP] [numeric](18, 0) NOT NULL,
    [FULLORDERNO] [char](12) NOT NULL,
    [COMPANY] [char](2) NOT NULL,
    [DIVISION] [char](2) NOT NULL,
    [SOURCE] [char](16) NOT NULL,
    [OFFER] [char](8) NOT NULL,
    [USECODE] [char](8) NOT NULL,
    [MAILDATE] [datetime] NOT NULL,
    [ENTRYDATE] [datetime] NOT NULL,
    [ENTRYTIME] [datetime] NOT NULL,
    [PAYMETHOD] [char](2) NOT NULL,
    [SHIPMETHOD] [char](2) NOT NULL,
    [ITEMEDPNO] [numeric](18, 0) NOT NULL,
    [LINENUMBER] [numeric](18, 0) NOT NULL,
    [QTY] [numeric](18, 0) NOT NULL,
    [EXTPRICE] [money] NOT NULL,
    [EXTCOST] [money] NOT NULL,
    [STATUS] [char](2) NOT NULL,
 CONSTRAINT [PK_FACTTABLE] PRIMARY KEY CLUSTERED
(
    [FULLORDERNO] ASC,
    [LINENUMBER] ASC
)WITH (PAD_INDEX  = OFF, STATISTICS_NORECOMPUTE  = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS  = ON, ALLOW_PAGE_LOCKS  = ON) ON [PRIMARY]
) ON [PRIMARY]

GO

Tags:

Analytics | Ecometry® | Reporting | SQL

Ecometry Business Intelligence and Reporting Analysis

by Brennan Mann 22. August 2010 00:08

Analysis Services is the cornerstone of your business intelligence infrastructure. If you are already running SQL Server 2005 or greater, you have all the capabilities to build a data warehouse without having to purchase expensive data modeling software. SQL Server 2005 Analysis Services (SSAS) provides a unified and integrated view of all your business data as the foundation for all of your traditional reporting, online analytical processing (OLAP) analysis, Key Performance Indicator (KPI) scorecards, and data mining.

The data warehouse schema is configured in a pattern known as a "star schema." The star schema is based on a central fact table with multiple dimensions and measurements. For example: get total sales by state, rep, and month for the last two years. Such a query would hit a multidimensional data model (aka a star schema or data cube). The presentation layer of the data can be delivered in Excel or deployed to a SQL 2005/2008 Reporting Server. This is a dynamic and interactive report. When I say dynamic, Excel actually creates a "data" connection to SQL Server Analysis Services, so Excel pivot tables and dashboards can easily be created.
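To make that concrete, here is a hypothetical query of that shape (mine, not from the original post) written against the fact table described below; it totals sales by company and month for the last two years:

SELECT  COMPANY,
        YEAR(ENTRYDATE)  AS OrderYear,
        MONTH(ENTRYDATE) AS OrderMonth,
        SUM(EXTPRICE)    AS TotalSales
FROM    dbo.FACTTABLE
WHERE   ENTRYDATE >= DATEADD(year, -2, GETDATE())
GROUP BY COMPANY, YEAR(ENTRYDATE), MONTH(ENTRYDATE)
ORDER BY COMPANY, OrderYear, OrderMonth;

Analysis Services answers the same kind of question from the cube without any hand-written SQL; the query is shown only to illustrate the shape of the data model.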

 

 

Screen shot of a star schema:

 

 

Fact Table

A fact table contains the basic information that you wish to summarize. This might be order detail information, payroll records or anything else that's amenable to summing and averaging. Any table that you've used with a Sum or Avg function in a totals query is a good bet to be a fact table.

For this example, our fact table contains the following fields:

 

Column         Data Type
CUSTEDP        numeric
FULLORDERNO    char(12)
COMPANY        char(2)
DIVISION       char(2)
SOURCE         char(16)
OFFER          char(8)
USECODE        char(8)
MAILDATE       datetime
ENTRYDATE      datetime
ENTRYTIME      datetime
PAYMETHOD      char(2)
SHIPMETHOD     char(2)
ITEMEDPNO      numeric
QTY            numeric
LINENUMBER     numeric
EXTPRICE       money
EXTCOST        money
STATUS         char(2)

This table contains normalized Ecometry data. I will post the SQL scripts shortly. If you would like them now, shoot me an email (brennan@plasmawebservices.com).

Measurements

Every cube will contain one or more measures, each based on a column in a fact table that you'd like to analyze. In the cube of book order information, for example, the measures would be things such as unit sales and profit.

The center table, aka the "fact" table, contains all the measurements. In this example, the measurements are extended cost and price.

Dimensions Table

A dimension table contains hierarchical data by which you'd like to summarize. Examples would be an Orders table that you might group by year, month, week, and day of receipt, or a Books table that you might want to group by genre and title.

Dimensions

Each cube has one or more dimensions, each based on one or more dimension tables. A dimension represents a category for analyzing business data: time or category in the examples above. Typically, a dimension has a natural hierarchy so that lower-level results can be "rolled up" into higher-level results. For example, in a geographic hierarchy you might have city totals aggregated into state totals, or state totals into country totals.
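In plain T-SQL, that kind of roll-up looks like the sketch below (my own illustration against the fact table above; Analysis Services performs these aggregations automatically once the hierarchy is defined in the cube):

SELECT  COMPANY, DIVISION, SUM(EXTPRICE) AS TotalSales
FROM    dbo.FACTTABLE
GROUP BY COMPANY, DIVISION WITH ROLLUP;
-- WITH ROLLUP adds a subtotal row for each COMPANY and a grand-total row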

For this example our data dimensions are:

Drill down on:

  • Customers: company, name, state, zip, RFM
  • Orders: line item status, extended cost/price, quantity, line number, pay methods, ship methods
  • Items: item number, item description, item category, item status
  • Sources: source, offer, use, campaign statistics
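As an illustration of what one of these dimension tables might look like, here is a sketch of a customer dimension matching the drill-downs above (mine, not from the post; the column sizes are assumptions):

CREATE TABLE [dbo].[DIMCUSTOMER](
    [CUSTEDP] [numeric](18, 0) NOT NULL PRIMARY KEY,  -- joins to FACTTABLE.CUSTEDP
    [COMPANY] [char](2) NOT NULL,
    [NAME] [varchar](60) NOT NULL,
    [STATE] [char](2) NOT NULL,
    [ZIP] [char](10) NOT NULL,
    [RFM] [char](9) NOT NULL  -- recency/frequency/monetary score
)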

 

 

Screen Shots

Below are some sample screen shots to show you the power of SQL Server 2005 Analysis Services. All of these reports were created dynamically through the Excel presentation layer within a matter of minutes. No programming or SQL knowledge is needed. The reports are created in real time by dragging and dropping dimensions and measurements onto an Excel pivot table. The reports can also be deployed to a SQL 2005/2008 web reporting server, where you can schedule the reports as jobs and have them emailed to you in a number of formats.

 

If you are using Excel 2010, there is much more functionality available through PowerPivot tables:

"PowerPivot for Excel is a data analysis tool that delivers unmatched computational power directly within the application users already know and love—Microsoft Excel. It’s the user-friendly way to perform data analysis using familiar Excel features you already know, such as the Office Fluent user interface, PivotTable and, PivotChart views, and slicers. It’s the fast way to generate rich and interactive analysis tools, easily mistaken for IT solutions after weeks of effort. It’s the right way to achieve deeper business insight and shorter decision cycles."


 

Screen shot of offer "on demand" sales by year and quarter. Notice the "+" and "-". This is the drill-down/up functionality.

 

 

Screen shot of offer sales by state and item number. Filtered by division, year and source.

 

 

Screen shot of offer sales by order number, item and item description. Filtered by division, year, source, month and ship method.

 

 

Screen shot of order status and sales. Filtered by division, year, source, month and ship method.

 

 

 

The next post will walk you through the process of creating your data cube. It will be a step-by-step walkthrough of creating an Analysis Services project in Visual Studio.

 

Let me know if you have any questions!

 

 

Tags:

Analytics | Ecometry® | Reporting

Expand Reporting with Google Analytics

by Brennan Mann 27. April 2010 06:46

Before I post the article on expanding reporting with ASP.NET 4.0, I want to show you a solution you can implement in a couple of hours. The folks at Google have put together an astonishing package of reports and analytics. If you have not already, sign up for your free Google Analytics account. Google offers us a gold mine of reporting tools. Some of the reports include:

  • Pageviews
  • Average Pageviews
  • Visits
  • Visitor Overviews
  • Time on site
  • New vs Returning
  • e-Commerce Tracking
    • Order Header Level Detail
    • Order Detail Level Detail
  • Depth of Visit
  • Loyalty
  • User-defined Reports
  • Analytics Intelligence
  • Dashboards
  • Custom Variables
  • Data Export - Includes Excel, CSV, PDF and tab-delimited files

Screen Shot of Home Page and Dashboard

 

Screen Shot of e-Commerce Tracking Home Page:

Product Statistics

 

Order Level Transactions - Includes Ecometry order number, order total, tax and P&H. Click on the order number to drill down on the order details.

 

How does all this work?

A simple cut and paste (thanks to Google).

Overview of the process:

 

1.) Customer confirms his/her order on the order recap page.

2.) The website processes the order and sends the data to Ecometry's WEBORDER.

3.) WEBORDER responds to the request and acknowledges the order.

4.) The order confirmation is displayed to the customer.

5.) After the webpage is rendered, we submit a transaction to Google. This transaction happens behind the scenes. It can take up to 24 hours for your transaction level detail to be visible in Google Analytics.

 

Over the weekend, I implemented Google's e-Commerce Analytics tracking on an Ecometry e-Commerce website. The e-Commerce tracking provides a rich set of tools to track conversion rates, transactions, and purchased products. The entire setup took about 30 minutes. Google really does all the work. We use JavaScript provided by Google to communicate with the Google Analytics servers. There are three JavaScript methods required for e-Commerce tracking.

 

Three JavaScript Methods are used:

 

1.) _addTrans()

This method initializes/creates the transaction object. The transaction object collects all the information about a single transaction (order IDs, shipping charges, billing address...).

2.) _addItem()

The _addItem() method tracks the individual items in the transaction (SKU, price, category and quantity).

3.) _trackTrans()

The _trackTrans() method sends the finalized transaction to Google. It usually takes about 24 hours for all the transaction data to be available in Google Analytics.

 

Basically, we are going to add three JavaScript snippets to the order confirmation page. The instructions below only apply to EEM (aka the Ecometry .Net Framework).

 

1.) The first snippet initializes the tracking object and adds order-header-level information. I placed this snippet at the top of the order confirmation "aspx" web control/page.

<script type="text/javascript">

  var _gaq = _gaq || [];
  _gaq.push(['_setAccount', 'UA-111111-1']);
  _gaq.push(['_trackPageview']);
  _gaq.push(['_addTrans',
    '<%=Cart.OrderNo%>',           // order ID - required
    'My Shopping Cart',  // affiliation or store name
    '<%=Cart.TotalAmount.ToString()%>',          // total - required
    '<%=Cart.TaxAmount.ToString()%>',           // tax
    '<%=Cart.ShippingAmount.ToString()%>',              // shipping
    '<%=Cart.Buyer.City%>',       // city
    '<%=Cart.Buyer.State%>',     // state or province
    'USA'             // country
  ]);

</script>

 

2.) The second snippet adds the order details to the tracking object.  You will need to place this snippet in the "foreach" cart lines loop.

<script type="text/javascript">

<%
                 
         
     foreach(CartLine line in  ship.CartLines) {   

%>
    
<script type="text/javascript">
   // add item might be called for every item in the shopping cart
   // where your ecommerce engine loops through each item in the cart and
   // prints out _addItem for each
  _gaq.push(['_addItem',
    '<%=Cart.OrderNo%>',           // order ID - required
    '<%=line.Product.ItemNo%>',           // SKU/code
    '<%=line.Product.Description%>',        // product name
    'DR',   // category or variation
    '<%=line.Price.ToString()%>',          // unit price - required
    '<%=line.Quantity.ToString()%>'               // quantity - required

  ]);            
 </script>
 <%} %>  

*Do not copy the "foreach" code snippet. I only included it to demonstrate the JavaScript placement. Please use the one already in place on your order confirmation page.

3.) The third snippet sends the data to Google. I placed this snippet after the last line of HTML/ASP in the order confirmation "aspx" web control/page.

<script type="text/javascript">

   _gaq.push(['_trackTrans']); //submits transaction to the Analytics servers

  (function() {
    var ga = document.createElement('script'); ga.type = 'text/javascript'; ga.async = true;
    ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js';
    (document.getElementsByTagName('head')[0] || document.getElementsByTagName('body')[0]).appendChild(ga);
  })();
</script>

That's it. Give Google about  24 hours to digest the data and you're off to the races.

A few suggestions:

  • Make sure you enable your Google Analytics web profile as an e-Commerce website. If you don't, the e-Commerce tracking will not work.
  • Make sure your Google Analytics is working with the main tracking object before adding the e-Commerce tracking. Every website profile you configure is assigned a Google Analytics tracking number (UA-1111111-1). Google will provide you a tracking JavaScript snippet to be placed on all your pages. This is not the same as the e-Commerce setup. If you are using EEM (aka Ecometry .Net), you should place the JavaScript snippet in your HTML master template(s) or in the website footer page. Google recommends placing the JavaScript snippet right before the closing "</body>" HTML element.
  • When placing the JavaScript snippets, make sure you are in scope to use the ASP variables. If you are not, a null exception will be thrown.
  • Do your testing on a test website.

 

Happy data hunting.

Please feel free to ask any questions!

 

 

 

 

Tags:

Analytics | Ecometry® | Programming | Reporting

Ecometry® Cron Jobs with Email Alerts

by Brennan Mann 24. April 2010 21:21

Having had system-critical jobs running in the early hours of the morning, I thought it would be nice to be able to wake up and have the job results waiting on my smartphone. I created an HTML email application that can easily be placed in existing batch/cron jobs. The email application will send standard lists, dynamic attachments and dynamic reports. The application also includes a customizable HTML template. With just a few lines added to your existing batch/cron jobs, you are on your way.

Sample Screen Shot:

 

1.) Download the Email Job Notification Application from:

http://www.plasmawebservices.com/downloads

Run the setup.exe and extract the files. By default, the program will be installed into C:\program files\pws\emailjobnotification

The following files and folders will be created in <Install Directory>\EmailJobNotification:

  • EmailJobNotification.exe - The actual email application
  • EmailJobNotification.exe.config - Application config file (contains SMTP settings)
  • test.bat - Test batch file to confirm SMTP connectivity
  • favicon.ico - Application icon
  • templates - Folder containing HTML and image templates
  • templates\email.html - Email template
  • templates\jobPro_logo.gif - Email image

You will need to configure the XML configuration file with your SMTP values. Edit the EmailJobNotification.exe.config configuration file in Notepad.

The file is located in the root of the installed directory. C:\program files\pws\emailjobnotification\EmailJobNotification.exe.config

        <EmailJobNotification.Properties.Settings>
            <setting name="smtpHost" serializeAs="String">
                <value>mail.myhost.com</value>
            </setting>
            <setting name="smtpUser" serializeAs="String">
                <value>smtpUser@test.com</value>
            </setting>
            <setting name="smtpPassword" serializeAs="String">
                <value>mypassword</value>
            </setting>
            <setting name="emailTo" serializeAs="String">
                <value>to_email@test.com</value>
            </setting>
            <setting name="emailFrom" serializeAs="String">
                <value>JobPro Support &lt;from_email@test.com&gt;</value>
            </setting>
            <setting name="emailSubject" serializeAs="String">
                <value>JobPro Notification</value>
            </setting>
            <setting name="emailCC" serializeAs="String">
                <value />
            </setting>
            <setting name="sTesting" serializeAs="String">
                <value>False</value>
            </setting>
            <setting name="sJobName" serializeAs="String">
                <value>My Job Name</value>
            </setting>
            <setting name="emailAttachment" serializeAs="String">
                <value />
            </setting>
        </EmailJobNotification.Properties.Settings>

The job name, to, cc and subject can be set in the batch file as environment variables. If the environment variables are present, the values in EmailJobNotification.exe.config will be overridden.

2.) Create a test batch file to test your SMTP settings

Below is a sample test batch file. The quickest editor for creating or modifying batch files is Notepad. The sample below is included in the email application (see the "batch files" folder in the application folder).

@echo off

: Set the Job Name to appear in the email
set JOBNAME=This is a test

: Path to your cron job output or log file
set LOGFILE=c:\new_log.txt

: Set email subject
set EMAIL_SUBJECT=JobPro Notification - This is a Test

: set email Recipients To list (use "," to separate email addresses)
set EMAIL_TO=test@test.com, test2@test.com

: set email Recipients CC list (use "," to separate email addresses)
set EMAIL_CC=test3@test.com, test3@test.com

:Add Attachment 1
set ATTACHMENT1=c:\program_output.txt

:Add Attachment 2
set ATTACHMENT2=c:\program_report.txt

:Send Email
c:\pws\EmailJobNotification.exe

If you are not familiar with batch scripts, ":" (technically a label marker) or "rem" can be used to make comments in a batch script.

3.) Add the new batch commands to your existing cron jobs. Below is an example of a production job:

@echo off

: *************************************************************
: *
: *  Who     Date            Comments
: *  -----   ---------       ---------------------------
: *  BM      09/01/09     Run order processing by division,
: *                                      send collates to printer and
: *                                      update PRINTJOB table
: *
: *************************************************************

: Division 01
echo Order Processing > E:\ECOMETRY\JOBPRO\logs\inspkjb1_log.txt
cronsched.exe E:\ECOMETRY\LIVECODE\Userstr2\inspkjb1 MGR 31000 E:\ECOMETRY\LISTENER\ECOMLIVE.ini >> E:\ECOMETRY\JOBPRO\logs\inspkjb1_log.txt
cscript /nologo E:\ECOMETRY\JOBPRO\VBS\orderprocessing.vbs 1 > E:\ECOMETRY\JOBPRO\logs\orderprocessing.txt

: Send Email Alert for division 01
set JOBNAME=Order Processing
set LOGFILE=E:\ECOMETRY\JOBPRO\logs\inspkjb1_log.txt
set EMAIL_SUBJECT=JobPro Notification - Order Processing Division 01
set EMAIL_TO=brennan@plasmawebservices.com
c:
cd c:\pws\email\
EmailJobNotification

: Division 02
cronsched.exe E:\ECOMETRY\LIVECODE\Userstr2\inspkjb2 MGR 31000 E:\ECOMETRY\LISTENER\ECOMLIVE.ini > E:\ECOMETRY\JOBPRO\logs\inspkjb2_log.txt
cscript /nologo E:\ECOMETRY\JOBPRO\VBS\orderprocessing.vbs 2 >> E:\ECOMETRY\JOBPRO\logs\orderprocessing.txt

: Send Email Alert for division 02
set JOBNAME=Order Processing
set LOGFILE=E:\ECOMETRY\JOBPRO\logs\inspkjb2_log.txt
set EMAIL_SUBJECT=JobPro Notification - Order Processing Division 02
set EMAIL_TO=brennan@plasmawebservices.com
c:
cd c:\pws\email\
EmailJobNotification

:End of Batch File

In the example above, we are "redirecting" the output from the cron job to a log file. Basically, instead of "seeing" the output from cronsched.exe in the command window, we are sending the output to a log file. The email application will parse the log file and find the job number and standlist. The standlist will be attached to the email notification.

For more information on MS-DOS redirection operators, please check out the following URL:

http://www.febooti.com/products/command-line-email/batch-files/ms-dos-command-redirection.html

 

If you have any questions, please feel free to ask.

 

Next week: using ASP.Net 4.0 to create a web portal for barometers and "drill-down" data reporting.

 

Tags:

Ecometry® | Programming

About the author

Brennan Mann started working on the Ecometry® platform in 2000. For the past ten years, Brennan has provided custom solutions and consultation for Ecometry® and other e-commerce platforms. In 2006, Brennan entered the realm of Microsoft's .Net technology.