Andornot Blog

Monday, April 22, 2013 3:25 PM

SmarterStats and how to batch add IP address filters

by Ted Jardine

If you're using SmarterStats as one means of tracking site traffic (if site stats mean anything to you, you should be using at least two means of tracking), and that site is also monitored by Pingdom (or any other monitoring service, but we highly recommend Pingdom), you'll find your stats are *slightly* inflated by Pingdom's bots, which come from around 30 monitoring locations around the world, each with its own IP address. For example, if you're pinging your site once per minute, that adds up to somewhere around 43,000 visits per month (60 checks an hour × 24 hours × 30 days = 43,200) - a not so lovely way to pad your stats.

Fortunately, SmarterStats provides a means to filter out these requests. Unfortunately, SmarterStats naively only allows for one-by-one entry via a relatively clunky interface and can only filter on IP address, IP address range, host header inclusion/exclusion, file, and directory. We have contacted SmarterStats multiple times in the past for a means to batch import IP addresses or better yet, filter requests based on a specific user-agent request header (which you can set in Pingdom), but no solution has been forthcoming. We've added Pingdom-specific host headers where feasible, but for the most part we've had to make do with a very manual (and time consuming) process.

This state of affairs is wrong on so many levels, so here I am to say "no more!" Bring out some scripting, because enough is enough.

Four caveats:

  1. Tested only with SmarterStats v6. Future or past versions may not operate similarly. 
  2. You'll need direct access to your server's SmarterStats config files. 
  3. The script is for *nix-based systems and has only been tested on a Mac, though any Linux flavour with uuidgen available should work fine. Windows will be a no-go. Sorry. Feel free to write a PowerShell equivalent.
  4. If this busts something or kills your hamster, we're not responsible.

As the Lorax would say, "You have been warned!"

The old way

For fun and jollies, review the usual, and until now only, way to manually add an IP address filter in SmarterStats:

  1. Log in as a site admin in SmarterStats.
  2. Go to "Settings".
  3. Go to "General Settings".
  4. Go to "Import Filtering".
  5. Click "Add > Import Filter".
  6. Select your import filter type, add your IP address, check "Make exclusions permanent" and click "Okay".
  7. Look at your list of remaining IP addresses.
  8. Look at the SmarterStats interface.
  9. Add another IP address.
  10. Forget to click save before navigating somewhere else. Lose all your entries.
  11. Look at your list of remaining IP addresses.
  12. Squint and try to determine which IP addresses you added and which ones you missed.
  13. Count them again.
  14. Cry.

And now for enlightenment

The first step is the script that does the heavy lifting. Relatively simple, but it does the trick:

# DESCRIPTION: Script to generate a list of all the latest
# Pingdom monitoring servers and export into an XML format 
# recognized by SmarterStats in a site's configuration file 
# (typically located in C:\Program Files (x86)\SmarterTools\SmarterStats\MRS\App_Data\Config\Sites\Site##.xml)
#
# USAGE: ./pingdom.sh | pbcopy to copy to clipboard # or ./pingdom.sh to simply output to console.

ips=`wget --quiet -O- https://my.pingdom.com/probes/feed | \
        grep "pingdom:ip" | \
        sed -e 's|</.*||' -e 's|.*>||'`

for ip in $ips
do
        guid=$(uuidgen | tr '[:upper:]' '[:lower:]' | tr -d '-')

        echo "<ImportFilter>
        <Guid>"$guid"</Guid>
        <FilterValue>$ip</FilterValue>
        <MatchBehavior>Exclude</MatchBehavior>
        <Type>Ip</Type>
        <Permanent>True</Permanent>
</ImportFilter>"
done

All the above does is fetch the current list of Pingdom's monitoring servers, parse out the IP addresses, and loop through them, generating a unique UUID (GUID for Windows speakers) for each, along with the necessary surrounding bits and pieces. Once you've got the above script saved somewhere handy on your local machine, do the following:

  1. Contact SmarterStats and ask them why they don't have a filter based on a request header user agent already.
  2. Make the above script file executable. For example, assuming the script file is at ~/pingdom.sh, simply:

    chmod +x ~/pingdom.sh
  3. Generate the necessary XML and stick it in your clipboard by running the following (adjusting file name and location as necessary):

    ./pingdom.sh | pbcopy
  4. Determine the applicable site ID/s. The easiest way is via the SmarterStats site listing.
  5. Make sure the site in question is not currently running an import. You may wish to stop your SmarterStats service. YMMV.
  6. Locate the site's respective config file on the site's host server. This file is typically in C:\Program Files (x86)\SmarterTools\SmarterStats\MRS\App_Data\Config\Sites\Site##.xml where the ## is the site ID determined in the previous steps. 
  7. If it will be a lot of work to recreate the site from scratch, make a backup of the config file before proceeding any further.
  8. Open the config file in your text editor and, right after the <TimeZoneIndex> entry, add your multiple <ImportFilter> entries, which should be in your clipboard. If you can't find this entry, make a dummy filter entry via the Web GUI and then locate it in the config file afterwards.
  9. Save after making sure you didn't muff anything up.
  10. Reload your filter settings for the site and see all 30+ entries. Rejoice. Re-index if you need to filter out bad data from before. Rejoice again.
  11. Repeat for all sites that use SmarterStats and are monitored by Pingdom. Note that you'll want to regenerate the XML in order to have unique UUIDs for each entry across all sites.
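For the adventurous, steps 6-8 can themselves be scripted. A minimal *nix sketch (insert_filters is a hypothetical helper, not part of SmarterStats; it assumes you've copied the Site##.xml config file to your local machine and saved the generated XML with ./pingdom.sh > filters.xml):

```shell
# insert_filters CONFIG FILTERS: splice the generated <ImportFilter> entries
# into CONFIG right after its <TimeZoneIndex> element, keeping a .bak copy
# (per the backup advice in step 7).
insert_filters() {
    config="$1"
    filters="$2"
    cp "$config" "$config.bak"
    awk -v f="$filters" '
        { print }
        /<\/TimeZoneIndex>/ {
            # append every line of the generated filter XML here
            while ((getline line < f) > 0) print line
            close(f)
        }
    ' "$config.bak" > "$config"
}
```

Copy the modified file back to the server and reload the site's filter settings as in step 10.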
Thursday, May 24, 2012 12:11 PM

Alfred! Fix my external display: Alfred, Mac OS X, and how you might be "plugging it in wrong"

by Ted Jardine

I've got a MacBook Air that I regularly plug an external display in and out of. If you plug in a display when your MacBook is sleeping or turned off, there are (typically) no problems. However, if your MacBook is on when you attempt to plug in an external display, it never automatically registers that a new display has been connected, and therefore never turns it on accordingly. Insert expletive here. Yes Virginia, Mac OS X does have its baffling quirks and bugs.

To fix this, you have to open System Preferences each time an external display is connected, go into the Displays preference pane, and then manually click the "Detect Displays" button. Even worse, if you have an external display set as the main display and you unplug said external display, you're left fumbling around in the dark, bringing up System Preferences with keyboard shortcuts; Alfred's "Displays" shortcut helps get a little further along, but still… Ack!

Instead of ranting further, I present my solution: "Alfred! Fix my displays!" 

 

  1. What?! You don't use Alfred yet? Download Alfred here
  2. What?! You're not using the Alfred PowerPack yet? It's not necessary for this script, but just get it anyways: Download the Alfred Powerpack here
  3. Download the "Detect Displays" Alfred extension via the link below


Now, whenever you need to give your MacBook/MacBook Air/what-have-you a kick in the virtual desktop pants, just use Alfred to run the "fix display" keyword and you'll no longer be "plugging it in wrong" ;-)

--

Full credit goes to Ravi K. Udeshi for the original applescript that can be found at http://raviudeshi.com/2011/03/automatically-detect-displays

 

Detect Display.alfredextension (5.50 kb)

Mac

Monday, September 06, 2010 12:41 PM

Errors: Sending the Right Message (Redux Covering ASP.NET 3.5/4.0)

by Ted Jardine

If you've read and followed up on my previous posts about handling errors, you might have found yourself pulling out your hair when you discovered that your error handling went south again when you changed your target ASP.NET framework from 2.0 to 3.5/4.0 (applies to WebForms only of course). So here I am to save your day again (and yes, I wasted an entire Friday evening figuring this one out - good times).

In a nutshell, upgrade to 3.5 and your awesomely handled 500 and 404 errors start going back to 302 and 200 redirects the ol' fashioned (and completely idiotic) ASP.NET way of doing things. Of course, if you are like me, you'd waste an entire evening on it. But fortunately, you're not so you won't.

There are two issues/solutions:

  1. As of ASP.NET 3.5 there's now a redirectMode="ResponseRewrite" setting for your customErrors setting in your Web.config (yeah!). Put it in and you're halfway there (the default is the aforementioned moronic "ResponseRedirect"). Now your:

                Context.Response.Clear();
                Context.Response.TrySkipIisCustomErrors = true;
                Context.Response.StatusCode = statusCode;
                Context.Response.Status = status;
               
                Context.Server.Transfer(errorPage, false);

     is back firing on all cylinders again.
  2. Except that your 404s are fine now, but your 500 errors return your error page twice in the body of your error page! Genius that I am, I went back to my experiences with IIS 6 and wrapped error pages and figured out that IIS 7/7.5 was sending back "its own" error page and then my server transfer was getting tacked on as well. Again, this does not affect 404s - I'm guessing because we clear the error (see previous post). So remove the Server.Transfer for non-404s and we're golden again.

In summary, for IIS 7/7.5 and .NET 3.5/4.0, use the following as a starting point for your Global.asax.cs or, better yet, an error module (for good measure figure out how it could be all tested):

        private readonly static ILog Log = LogManager.GetLogger(typeof(Global));
        private string _errorPageLocation;
        private string _error404PageLocation;


        protected void Application_Start(object sender, EventArgs e)
        {
            // Wire up log4net database connection string if desired.


            if (Log.IsInfoEnabled)
                Log.Info("Log4Net initialized - Ignore.");
        }


        protected void Application_Error(object sender, EventArgs e)
        {
            InitConfigurationFields();


            Exception exception = Context.Server.GetLastError();

            if (exception is HttpUnhandledException)
                if (exception.InnerException != null)
                    exception = exception.InnerException;


            if (exception is HttpException)
            {
                var ex = (HttpException)exception;
                var statusCode = ex.GetHttpCode();

                if (statusCode == 404)
                {
                    ServerTransfer(_error404PageLocation, "404 Not Found", statusCode);
                    return;
                }
            }


            Log.Error("Unhandled error caught in error module.", exception);

            // TODO: Any way to get the correct status statement for a specific code?
            // Hard-coding all to "500 Internal Server Error" here.
            ServerTransfer(_errorPageLocation, "500 Internal Server Error", 500);
        }


        /// <summary>
        /// Gets configuration fields for database connection (if applicable),
        /// and customError redirects.
        /// </summary>
        private void InitConfigurationFields()
        {
            var section = WebConfigurationManager.GetSection("system.web/customErrors");

            if (section != null && section is CustomErrorsSection)
            {
                var customErrorsSection = (CustomErrorsSection)section;
                _errorPageLocation = customErrorsSection.DefaultRedirect;

                var error404 = customErrorsSection.Errors["404"];

                if (error404 != null)
                    _error404PageLocation = error404.Redirect;
            }


            if (string.IsNullOrEmpty(_errorPageLocation))
                throw new NullReferenceException(
                    "customErrors DefaultRedirect must be specified in Web.config, e.g. ~/error.aspx");


            if (string.IsNullOrEmpty(_error404PageLocation))
                throw new NullReferenceException(
                    "customErrors 404 statusCode redirect must be specified in Web.config, e.g. ~/page-not-found.aspx");
        }


        private void ServerTransfer(string errorPage, string status, int statusCode)
        {
            // If customErrors is off, just let ASP.NET default happen.
            if (Context == null || !Context.IsCustomErrorEnabled)
                return;


            // Want the error around so that we can provide a little more
            // descriptive message to the end user in the error page.
            // However, for 404 errors (only!), not clearing the error causes IIS7 to
            // still hijack the process and the custom 404 error page.
            if (statusCode == 404)
                Server.ClearError();

            Context.Response.Clear();
            Context.Response.TrySkipIisCustomErrors = true;
            Context.Response.StatusCode = statusCode;
            Context.Response.Status = status;

            // For .NET 3.5, doing a Server.Transfer combined with customErrors
            // redirectMode="ResponseRewrite" returns the error page body *twice*.
            if (statusCode == 404)
                Context.Server.Transfer(errorPage, false);
        }
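The matching Web.config customErrors section that the InitConfigurationFields method above reads looks something like this (a sketch; the two page paths are just examples, echoing the exception messages above):

```xml
<system.web>
  <!-- ResponseRewrite keeps the original URL and status code (no 302) -->
  <customErrors mode="RemoteOnly" redirectMode="ResponseRewrite"
                defaultRedirect="~/error.aspx">
    <error statusCode="404" redirect="~/page-not-found.aspx" />
  </customErrors>
</system.web>
```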

Wednesday, June 09, 2010 12:15 PM

Content Server (aka "DB/Text for SQL") Backup Script Utility

by Ted Jardine

If you are using Content Server (now known as "DB/Text for SQL"), it goes without saying that you should have regular automatic backups in place: regularly verified and regularly "test" restored. You DO have such a plan in place, right? Right?!?


For backing up DB/Text for SQL, it's a touch more involved than a typical SQL Server backup, simply because there's data both in and out of SQL Server and both need to be backed up at the same time. Moreover, the SQL Server databases cannot be backed up by typical file-copy mechanisms (to oversimplify, the actual files are always locked).

Fortunately, a manual backup of CS is pretty easy: just open up your CS Admin and follow the prompts to back up everything in one fell swoop into one handy .dat backup file. The bad news with that however, is that manual backups are a half-baked solution; backups need to be automated or they just won't get done often enough (if at all).

Options for automated CS backups:

  1. Back up the SQL Server aspects of CS as part of your regular automated SQL Server backups. And ensure that the external files are backed up at the same time. This solution will likely require the involvement of your IT staff. Moreover, automated SQL Server backups require SQL Server Agent which is not included with SQL Server Express (a requirement at least for a backup process that isn't the digital equivalent of the hokey-pokey dance crossed with a waltz). See your Administrator's Guide for details (for backup instructions, not dance tips).
  2. Use the built-in CS Admin backup capabilities, but automate them by plunking some CS-specific backup scripts in a batch file and scheduling them with Windows Task Scheduler.

Aha, you SQL-Server-Express-loving-person! Option two sounds great! Let's do it! Not so fast...have you seen those scripts? If you've got any more than one or two textbases, you'll go blind trying to accurately create and maintain those batch files. And that's where our new handy-dandy CS Backup Script Generator comes in. With it, you can quickly and easily generate accurate backup scripts contained in an automatically generated batch file that can be automatically run with a scheduled task. Did I mention automatically!?

Backups + Automation = Sweetness.

CS Backup Script Generator

 

  1. Download the Andornot CS Backup Script Generator here (yes, it's priceless so it's free) and install. Note that you can install the utility on any Windows machine as it does not need to be on your CS server.
  2. If you don't already have the Microsoft .NET Framework 4.0 installed, download it here and then install.
  3. Run the utility (should be in your start menu with a link on your desktop) and fill in three fields:
    1. SQL Server Instance: the name of your SQL Server instance, such as MachineName\SQLEXPRESS (if you don't know it, you can find it easily in your CS Admin).
    2. Backup directory: where you wish to have all backup files saved to (such as "D:\Backups\").
    3. Textbase Information Text: the text from the "List Textbases" summary within CS Admin (just select all the text in the summary, making sure to include the entire textbase listing), such as the following:

      ...
      Textbase: D:\data\Barcodes
        SQL Database: '_InmTB_18'
      Textbase: D:\data\Borrower
        SQL Database: '_InmTB_19'
      Textbase: D:\data\Catalog
        SQL Database: '_InmTB_20'
      Textbase: D:\data\Contacts
        SQL Database: '_InmTB_21'
      Textbase: D:\data\Loans
        SQL Database: '_InmTB_22'
      ...

      The utility is smart enough to pick up the list of textbases from all the text in the summary, so you can paste in as much or as little of the summary as you like, as long as it includes the textbase listing. Get to the listing in CS Admin via the "Manage Textbases" > "List Textbases" menu item.
  4. Hit "Preview" to take a look at your new masterpiece (and shudder to think of doing that manually, especially if you've got 10+ textbases) and/or "Batch it!" to save a .bat file to the location and name of your choosing.

     CS Backup Script Generator Preview
  5. Test the generated batch file by double-clicking it. After a few minutes (or less, depending on the size of your database), there should now be a .log and .dat file for each of your textbases in your specified backup directory.
  6. Review the .log file for any errors or warnings.
  7. Schedule the batch file to run daily during an off-peak time.
  8. Regularly review the generated log files for any errors or warnings and verify that backups are taking place as required.
  9. Ensure that your backup directory itself is backed up off-server and off-site for disaster recovery. The generated .dat files can be simply x-copied (copied and pasted) to another location, or better yet, automatically backed up with your server's backup software as part of your server's regular backup routine.
  10. Every so often, test out restoring your .dat backups via the CS Admin.
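To automate step 7, the Windows Task Scheduler command line does the trick; a sketch, assuming you saved the batch file as D:\Backups\cs-backup.bat (a hypothetical path and task name - adjust to suit):

```batch
rem Run the generated CS backup batch file daily at 2:00 AM (off-peak)
schtasks /create /tn "CS Backup" /tr "D:\Backups\cs-backup.bat" /sc daily /st 02:00
```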

Remember, "hell hath no fury like data scorned." Please love your data and back it up.

Disclaimer: this backup utility is very much "beta" quality. I'm not responsible for anything it may or may not do to your system (bad or otherwise). Use at your own risk. Having said that, the setup program simply places the .exe with some support DLLs on your system, and when the utility is run, you can easily review the batch files it generates before running them (which you could generate manually yourself if you prefer).

Wednesday, May 12, 2010 10:10 AM

Adding Google Analytics without touching your site

by Ted Jardine

I had a problem: Kathy and Denise wanted Google Analytics configured for our demo Genie site that can be found at http://genie.andornot.com. However, Genie doesn't provide an easy way to centrally add the couple lines of Google Analytics code to every page served up by the application. For that matter, neither does any WebPublisher PRO site right out-of-the-box (easy-peasy though if we're using ASK and our WebPubResults control).

You might also have the same problem: you recognize that having multiple means of tracking and analyzing your site's traffic is no longer optional. However, while you've got server log file analysis handled (we use SmarterStats for all our hosted clients) your site doesn't have a single central place (such as site-wide central templates) to easily integrate a javascript-based page tagging solution such as Google Analytics.

So what do we do? Instead of updating 10s if not 100s or 1000s of files manually one-by-one (crossing my fingers that I could find them all) and hoping my code wouldn't need to change anytime soon, I came up with a modified version of the Web Analytics Tracking Module available on Microsoft's IIS.net site. Now, with a little elbow grease for initial server setup (seriously, not very much elbow grease at all), turning on site tracking for any site on our servers is simply a matter of placing a DLL in our application's bin directory, and adding a couple lines to our site's Web.config. Sweet.

Why not just use the original IIS.net Web Analytics Module?

Before jumping in and explaining how to set everything up, I should explain that I needed to create my own custom build of the module, because the module on IIS.net only works under IIS 7's integrated pipeline, not under IIS 7's classic pipeline (which Genie needs to run under) or IIS 6. The required changes ended up being relatively minor:

  1. Modified the ReadModuleConfiguration method in WebAnalyticsHttpModule.cs to the following:
  2. /// <summary>
     /// Reads the module specific configuration properties
     /// </summary>
     /// <param name="context"></param>
     /// <returns>Boolean indicating the success/failure</returns>
     private bool ReadModuleConfiguration(HttpContext context)
     {
         try
         {
             ConfigurationSection section = null;

             if (HttpRuntime.UsingIntegratedPipeline)
                 section = WebConfigurationManager.GetSection(context, "system.webServer/webAnalytics", typeof(WebAnalyticsSection));
             else
                 section = WebConfigurationManager.GetSection(context, "system.web/webAnalytics", typeof(WebAnalyticsSection));

             if (section != null)
                 _webAnalyticsModuleConfig = (WebAnalyticsSection)section;
         }
         catch (Exception)
         {
             return false;
         }

         return _webAnalyticsModuleConfig != null;
     }
  3. Add the classic pipeline <system.web>/<webAnalytics> schema to WebAnalytics_schema.xml:
  4. <sectionSchema name="system.web/webAnalytics">
         <attribute name="trackingEnabled" type="bool" defaultValue="false"></attribute>
         <attribute name="trackingScript" type="string" defaultValue="This is a default text"></attribute>
         <attribute name="insertionPoint" type="enum" defaultValue="body">
             <enum name="head" value="0" />
             <enum name="body" value="1" />
         </attribute>
     </sectionSchema>
    
  5. To make sure there are no versioning conflicts, I updated the assembly's version to v1.1 in AssemblyInfo.cs (note the v1.1 references in the instructions below as opposed to v1.0):
  6. [assembly: AssemblyVersion("1.1.0.0")]
     [assembly: AssemblyFileVersion("1.1.0.0")]

 

Configure your server (one time only)

  1. Register the WebAnalyticsModule.dll in the GAC (for shiny stuff in the IIS 7 Manager GUI).

    gacutil -if WebAnalyticsModule.dll
  2. Copy the WebAnalytics_schema.xml to "%windir%\system32\inetsrv\config\schema" folder.
  3. Add the following section definition to the "%windir%\system32\inetsrv\config\applicationhost.config" file in the sectionGroup for "system.webServer"
  4. <section name="webAnalytics" overrideModeDefault="Allow" />
  5. Add the module to the IIS Manager configuration by adding to two collections in the "%windir%\system32\inetsrv\config\administration.config" file:
    • Add the following to the moduleProviders collection:
    • <add name="WebAnalytics" type="WebAnalyticsModule.WebAnalyticsProvider, WebAnalyticsModule, Version=1.1.0.0, Culture=neutral, PublicKeyToken=c6b7132bcfe43312" /> 
      
    • Add the following to the modules collection:
    • <add name="WebAnalytics" />


web-analytics-tracking-in-iis
Web Analytics Tracking now directly available in IIS 7 Manager (at least for sites running under Integrated Pipeline).

 

Configure your site/s

For each site you wish to have Google Analytics enabled on (or any other script/text/html you wish to have run on every page in the site), enable it according to the following instructions (slightly different for different flavours of IIS/pipeline).

IIS 7 Integrated Pipeline

  1. Place the same WebAnalyticsModule.dll in your application's bin directory or reference the GAC version via the Web.config's <system.web>/<compilation>/<assemblies> element.
  2. Add the module to the <modules> element in your Web.config. This can be done through the IIS Manager, but it's easier to simply add it directly in the Web.config:
  3. <modules>
       ...
       <add name="WebAnalytics" type="WebAnalyticsModule.WebAnalyticsHttpModule, WebAnalyticsModule, Version=1.1.0.0, Culture=neutral, PublicKeyToken=c6b7132bcfe43312" />
     </modules>
  4. Specify the script/text you wish to have inserted and the location to insert it at. Again, this can be added directly in the Web.config. However, because the actual script/text needs to be encoded in order to keep your config file from going boink, it's a lot easier to just use the GUI.

  5. web-analytics-tracking-in-iis-configuration
    Easy configuration within the IIS 7 Manager GUI, including properly encoding the script for the Web.config (safe for XML).

    Enabling the above adds a <webAnalytics> element to the <system.webServer> section, looking something like (note the encoding):
    <webAnalytics trackingEnabled="true" trackingScript="&lt;script type=&quot;text/javascript&quot;>&#xD;&#xA;  var _gaq = _gaq || [];&#xD;&#xA;  _gaq.push(['_setAccount', 'UA-494411-1']);&#xD;&#xA;  _gaq.push(['_setDomainName', '.andornot.com']);&#xD;&#xA;  _gaq.push(['_trackPageview']);&#xD;&#xA;  (function() {&#xD;&#xA;    var ga = document.createElement('script'); ga.type = 'text/javascript'; ga.async = true;&#xD;&#xA;    ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js';&#xD;&#xA;    var s = document.getElementsByTagName('script')[0]; s.parentNode.insertBefore(ga, s);&#xD;&#xA;  })();&#xD;&#xA;&lt;/script>" insertionPoint="head" />
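For reference, decoded (un-XML-escaped), that trackingScript value is just the standard Google Analytics asynchronous snippet:

```html
<script type="text/javascript">
  var _gaq = _gaq || [];
  _gaq.push(['_setAccount', 'UA-494411-1']);
  _gaq.push(['_setDomainName', '.andornot.com']);
  _gaq.push(['_trackPageview']);
  (function() {
    var ga = document.createElement('script'); ga.type = 'text/javascript'; ga.async = true;
    ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js';
    var s = document.getElementsByTagName('script')[0]; s.parentNode.insertBefore(ga, s);
  })();
</script>
```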

 

IIS 7 Classic Pipeline

Exactly the same as the IIS 7 integrated pipeline instructions above, with the following changes:

  1. Add the module reference to the classic pipeline specific <system.web>/<httpModules> section instead of the <system.webServer>/<modules> section.
  2. Make it easy for yourself, and add the script-specific settings (as in step 3 above) via the GUI in order to make sure it's all encoded properly, but then move/copy the resulting <webAnalytics> element from <system.webServer> (IIS 7 integrated pipeline specific) to <system.web>.
  3. Add the following to the <configuration>/<configSections> element in your Web.config (there should be a better way to add another sub-section to the already defined <system.web> declaration in the <configSections> element, so let me know in the comments if you know what it is):
  4. <sectionGroup name="system.web" type="System.Web.Configuration.SystemWebSectionGroup, System.Web, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a">
         <section name="webAnalytics" type="WebAnalyticsModule.WebAnalyticsSection, WebAnalyticsModule" />
     </sectionGroup>

Caveat: as your app is running under the classic pipeline, not all text/html resources run through ASP.NET; therefore, if you have, for example, .html and/or .asp pages you want the script automatically added to, you will need to map those extensions to also route through the ASP.NET runtime. Pretty standard classic pipeline stuff: without doing so, only pages with the .aspx extension will have the script added.

 

IIS 6 (Classic Pipeline)

I haven't tested it, but I see no reason why it wouldn't work with the same setup as IIS 7 Classic Pipeline, with the same caveats.

Notes:

  1. As you might have guessed, you can automatically insert *anything* using this module, as long as you don't mind it inserting just before the closing </head> tag or the closing </body> tag. Copyright statements, survey scripts, "Ted wuz here" alerts for every page...if you think of it, you can do it!
  2. Again, the fancy GUI stuff doesn't help you in classic pipeline scenarios (other than an XML-encoding aid as described above).
  3. Typically you want to register scripts as close to the end of your page as possible (various technical performance and usability reasons). However, Google Analytics' latest scripts have a "push" functionality making this a moot point: so register the latest scripts right before the closing </head> tag. See the "Asynchronous Tracking Usage Guide" for more information.

Different web analytics packages have different strengths and weaknesses, and it's only with multiple different perspectives on your site traffic that you even begin to get a clear picture of what is really happening with your site. Using this web analytics module makes it even easier to add and maintain a page tagging solution on your site.

 

Download

I'd like to just directly link to my compiled DLL and schema file of the module, but unfortunately, it's not at all clear what licensing model is associated with the original module (although it's clear from his article explaining the module that Microsoft's Ruslan Yakushev intends for us to use it). Therefore, I'll be requesting clarification on this issue, but in the meantime, just download the source code from the original article and make the quick modifications as described above. If you have problems doing so, or you want help deploying it for your site, drop us a line.
