C#, Software Development, Projects, Games »

[28 Aug 2014 | 341 Comments]

I like playing Counter-Strike: Source (CS:S). I like playing against my friends at LAN parties (yes, we actually still physically gather with our PCs at some location, network them up and play against each other; we call it Lanfix). We then like to know how we did. We used to use Psychostats to generate a visual dashboard of what happened in game, but it was no longer maintained and did not support the log format now coming out of CS:S.

@naiboss put out a request on the Lanfix mailing list to see if someone could tweak the Psychostats code to work with the new log files, so I thought I would have a look.

Instead of tweaking the Psychostats code I wrote a new application. After I had finished writing the app I read NoSQL Distilled: A Brief Guide to the Emerging World of Polyglot Persistence, a good book about NoSQL databases by Martin Fowler. At the end of the book he quickly covers other types of databases, one of them being an Event Sourcing database, and it occurred to me that this is exactly what the CS:S log files were. The log files were events, recorded in the order they had happened in the game. My app essentially replayed these events to build up the state of the application. A fuller description of Event Sourcing can be found here. I didn't know it at the time, but I had built an app using an Event Sourcing database.

This post is a brief look at the solution I came up with after I decided it would be easier to write an app from scratch rather than try and update the Psychostats codebase. The working version of the final application is up and running if you want to see what it does before reading how it does it. I finish the post with a summary of what I learnt.

Why I didn’t tweak Psychostats

I looked at the Psychostats code. It is written in PHP, which is not my strong point and not something I want to become a strong point. It required a MySQL database, which always cost some money and caused us problems with hosting and upgrades. On top of that, there was a process of uploading the log files, which then had to be interpreted and put into the correct database tables so the PHP pages could run queries to display the stats.

The interpretation engine was full of regex. I forget how regex works very quickly and find it hard to read. I didn’t relish trying to work out which regex I had to tweak to get the new style logs to load into the database properly, and I did not want to do a lot of re-work in a language that I personally don’t rate.

New Design Constraints

As I was starting from scratch it was a good opportunity to really think about the design. I came up with the following requirements:

  • Must be able to run without a database. That was always a pain to manage in the old system
  • Must be testable with automated unit tests
  • Must be easy to understand and maintain the code when the log files change. This really meant not using regex.
  • Must allow old logs to be interpreted side by side with new logs as and when the log format changes

Step 1 – Look at the raw log file

The most obvious place to start was the raw log file that had to be interpreted. It was interesting to see what I had to work with: every line had a timestamp, and the lines ran in time order, with each ‘CS:S server event’ being appended to the log file. Maps loading, players buying weapons, players shooting other players, in-game chat: there was a lot going on. The goal was to make sense of these things and display them in a nice dashboard.
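To give a flavour, CS:S writes lines in the Half-Life log standard, so a kill event looks roughly like this (an illustrative example, not a line lifted from our actual logs):

L 08/28/2014 - 21:30:00: "PlayerOne<2><STEAM_0:1:12345><CT>" killed "PlayerTwo<3><STEAM_0:0:54321><TERRORIST>" with "m4a1"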

Step 2 – Interpret the raw log file

I decided the first thing I needed to do was write something that parsed the file and populated an object I could then use in my application. I let the data I found in the log files drive the design of my domain objects.

I started by writing code and unit tests which picked out specific lines from the log file. After I got this working I wrote a ‘Line Processor’ for each line type, responsible for understanding what the data meant once the line had been identified. The processor extracted the data from the line and populated the appropriate domain objects.
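A minimal sketch of the idea (the interface and class names here are illustrative, not the real code):

public interface ILineProcessor
{
    // True if this processor recognises the given log line.
    bool CanProcess(string logLine);

    // Extract the data from the line and populate the domain objects.
    void Process(string logLine);
}

public class PlayerKillProcessor : ILineProcessor
{
    public bool CanProcess(string logLine)
    {
        // Identify kill events by a known marker rather than a regex.
        return logLine.Contains("\" killed \"");
    }

    public void Process(string logLine)
    {
        // Pull out attacker, victim and weapon, then update their stats.
        // (Parsing details omitted.)
    }
}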

Step 3 – Create a UI

The aim of the exercise is to create a nice dashboard for people to view the stats after a session. For this I used Sencha’s ExtJS framework. It has a very rich set of controls and excellent documentation. I delegated the layout to @naiboss and @krofunk. The UI development is what drove the requirements for the data that had to be delivered from the domain objects.

It was an interesting exercise because a number of ‘Server Events’ were missing and the data had to be derived from what we had. An example of this was the ‘successful bombing’ count and how to attribute it to a player. I solved this problem by keeping track of the last person to plant a bomb; when a round finished because of a bombing, I attributed that to the person who last planted the bomb. This required a context to be created for the Line Processors to work in, so they could be aware of previous output.
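Sketching that in the same illustrative style, each processor gains access to a shared state object (the trigger name and parsing are indicative only):

public class GameState
{
    // Shared context so Line Processors can see the results of earlier events.
    public string LastBombPlanter { get; set; }
}

public class BombPlantedProcessor
{
    public void Process(string logLine, GameState state)
    {
        // Remember who planted so a later bomb-win round end can be
        // credited to that player.
        state.LastBombPlanter = ExtractPlayerName(logLine);
    }

    private static string ExtractPlayerName(string logLine)
    {
        // The player name sits between the first double quote and the
        // following '<' in the log line.
        int start = logLine.IndexOf('"') + 1;
        int end = logLine.IndexOf('<', start);
        return logLine.Substring(start, end - start);
    }
}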

Step 4 – Create a scoring system

People always like a bit of competition, so @naiboss devised a scoring system where people are given points for various things, like ‘a kill’ or ‘a bombing’. The scores are weighted by weapon and bonus points are given for awards, such as ‘most headshots’.

The scoring system is developed against an interface, allowing it to be swapped out for other scoring systems if required.
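The real interface isn’t reproduced here; a guess at its shape might be:

public interface IScoringSystem
{
    // Points for a kill, weighted by the weapon used.
    int ScoreKill(string weapon, bool headshot);

    // Points for a successful bombing.
    int ScoreBombing();

    // Bonus points for an award such as 'most headshots'.
    int ScoreAward(string awardName);
}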

Step 5 – Mash it up - Steam Integration

As the UI and scoring system were progressing we were thinking about how we could make it easy for people to manage their profiles. The answer we came up with was to leverage their Steam profiles. The log provides the SteamId of each player, so it was possible to query Steam for more information. Queries are made to get the logos and player names.
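Steam’s Web API exposes a GetPlayerSummaries method that returns this kind of profile data as JSON; a minimal call looks something like this (the key and SteamId below are placeholders, and the app’s actual code may differ):

// Requires System.Net for WebClient.
using (var client = new WebClient())
{
    // Returns persona names and avatar URLs for the given SteamIds.
    var url = "https://api.steampowered.com/ISteamUser/GetPlayerSummaries/v0002/"
            + "?key=YOUR_API_KEY&steamids=76561197960435530";
    string json = client.DownloadString(url);
    // Parse response.players[n].personaname and the avatar fields from the JSON.
}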

Step 6 File management – Mash it up – Dropbox integration

We were now thinking about how to allow the log files to be uploaded. As there was no database and the whole system runs on the fly from the log files, I decided to look at integrating with Dropbox. A user could register their Dropbox credentials and then sync their Dropbox with the server to upload their log files.

This then raised another interesting possibility: how to allow sessions to be looked at in isolation or together. By using the ubiquitous analogy of file and folder management, I updated the file parser to process all the log files in a folder and its sub-folders. So when we had a LAN party with CS:S in the morning and afternoon, we could create a ‘LanParty’ folder with AM and PM folders inside it. Clicking on the LanParty folder gives you the stats for all the logs of the day, but clicking on either the AM or PM folder shows what happened in those individual sessions.
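In .NET the recursive walk is a one-liner, and the parser simply aggregates everything it finds (the path and parser variable here are illustrative):

// Process every log file in the chosen folder and all of its sub-folders.
var logFiles = Directory.EnumerateFiles(
    @"C:\Logs\LanParty", "*.log", SearchOption.AllDirectories);

foreach (var file in logFiles)
{
    parser.Process(file); // replay each file's events into the session state
}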

This just worked, and I dread to think how hard it would have been to implement with something like the Psychostats database.

Closing Thoughts

It was a really interesting little project and is not quite finished yet. The Dropbox integration, which is essential to allow other people to use the system, does not work properly yet.

I learnt a lot as I implemented design patterns that I read about in the excellent book ‘Dependency Injection in .NET’. I have a good suite of unit tests over the file parser, so I will be able to easily and safely change the parser to handle updated file formats.

C#, Amazon Web Services (AWS), SES, Bulk Email »

[4 Dec 2011 | 0 Comments]

I recently did a talk at DevEvening about how I managed to use Amazon’s Simple Email Service with my (fairly) new website The Gig Market, and I thought I would share a bit more information about it in case anyone is interested.

The Problem

As the website is hosted on Windows Azure there is no mail server provided by the host that I can use to send out emails. The answer is to use a 3rd party service to do this for you. I had been using a company for 3 years to do this for my other websites and had been very happy with them… until I breached the send limit one month and they just stopped sending emails. This really annoyed me, so I set about finding a better service. What I found was AWS’s SES, which only costs $0.10 per 1,000 emails. The problem was that it is a web service, not an SMTP server, so I couldn’t just change the SMTP server in my app’s settings.

The Research

What I wanted to do was tweak my code to use the SES API. To limit the impact I only wanted to change the:

SmtpClient smtp = new SmtpClient();
smtp.Send(mailMessage);

code to something that could just send the mailMessage object to Amazon SES. The problem is that out of the box the .NET MailMessage object can only be used with the .NET SmtpClient. Doh.

I found a couple of other people with the same idea:

http://www.codeproject.com/KB/IP/smtpclientext.aspx

This post by Allan Eagle was a great start for me as it showed me how to get at the guts of the MailMessage object by using reflection. Now I only had to send it to SES, right?

http://neildeadman.wordpress.com/2011/02/01/amazon-simple-email-service-example-in-c-sendrawemail/

This post by Neil Deadman was a great article and I thought I had found my solution, BUT there was a big problem: I needed to send BCC recipients and set the priority of the email, and it turned out that I could not achieve this using that solution. So what could I do?

The Solution

I ended up hitting the Amazon docs and discovered that I would have to use the raw message format, writing the email out by hand in MIME format. I thought this would be hard, but it turned out to be pretty easy; I knocked up an extension method on the MailMessage class to do it for me.

public static string ToAmazonSesRawFormat(this MailMessage message)
{
    var result = new StringBuilder();

    // Standard MIME headers first.
    result.AppendLine("MIME-Version: 1.0");
    result.AppendLine(string.Format("From: {0}", message.From));

    if (message.To.Count > 0)
    {
        result.AppendLine("To: " + string.Join(",", message.To));
    }

    if (message.CC.Count > 0)
    {
        result.AppendLine("Cc: " + string.Join(",", message.CC));
    }

    if (message.Bcc.Count > 0)
    {
        result.AppendLine("Bcc: " + string.Join(",", message.Bcc));
    }

    result.AppendLine("Subject: " + message.Subject);
    result.AppendLine(string.Format("Content-Type: {0}", message.IsBodyHtml ? "text/html;" : "text/plain;"));
    result.AppendLine("Content-Transfer-Encoding: quoted-printable");

    // Writes the MailPriority enum value (Normal = 0, Low = 1, High = 2).
    result.AppendLine(string.Format("X-Priority: {0}", (int)message.Priority));

    // A blank line separates the MIME headers from the body.
    result.AppendLine(string.Empty);

    if (message.IsBodyHtml)
    {
        // QuotedPrintableEncoder is a helper class in my project,
        // not part of the .NET framework.
        var encoder = new QuotedPrintableEncoder();
        result.AppendLine(encoder.EncodeFromString(message.Body, Encoding.ASCII));
    }
    else
    {
        result.AppendLine(message.Body);
    }

    return result.ToString();
}
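To illustrate the extension method in use (the addresses here are just example values):

var mailMessage = new MailMessage("from@example.com", "to@example.com")
{
    Subject = "Hello from SES",
    Body = "<p>Hi there</p>",
    IsBodyHtml = true,
    Priority = MailPriority.High
};
mailMessage.Bcc.Add("hidden@example.com");

// Produces the raw MIME string ready for the SES raw email API.
string rawMimeMessage = mailMessage.ToAmazonSesRawFormat();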

Now all I had to do was send the message via the SES raw email API:

AmazonSes.SendEmail.Instance.SendRawMessage(mailMessage);
 
The SendRawMessage function lives in the wrapper class I wrote for the API:
 
public void SendRawMessage(MailMessage mailMessage)
{
    // The using block disposes the stream, so no explicit Close() is needed.
    using (var memoryStream = new MemoryStream())
    {
        var encoding = new UTF8Encoding();
        var byteArray = encoding.GetBytes(mailMessage.ToAmazonSesRawFormat());

        memoryStream.Write(byteArray, 0, byteArray.Length);
        memoryStream.Position = 0;

        var message = new RawMessage(memoryStream);
        var sendRawMessageRequest = new SendRawEmailRequest(message);
        var response = Client.SendRawEmail(sendRawMessageRequest);
    }
}

C#, MS CRM, MS CRM4, Dynamics CRM »

[24 Jan 2011 | 1 Comment]

On my current Microsoft Dynamics CRM project we have done a lot of customisation, both creating custom pages and manipulating the existing CRM pages via the OnLoad method. This post describes a method of ensuring the JavaScript load order, plus the performance benefit compared with loading multiple external JavaScript files the more conventional way.

The existing approach

One of the big problems we faced was making sure that the JavaScript files load in a specific order. This is because we have some common functions in a file that can be re-used on multiple pages. These need to be loaded before the main page JavaScript file, which might call one of the common functions.

There are a number of good blog posts about this topic so I won’t go over this ground again. Here are a few articles I’ve found:

http://danielcai.blogspot.com/2010/02/another-talk-about-referencing-external.html
http://www.henrycordes.nl/post/2008/05/27/External-js-file-and-CRM.aspx

What is different with this new approach

The solution that I propose here was the brainchild of @njwatkins, who came up with the idea of implementing a generic .NET HTTP handler as part of our custom webpages project. The handler reads in the various external JavaScript files, in the correct order, compresses them and then streams them back to the browser as a single JavaScript file.

Unfortunately I cannot post the source code of the handler we use on this project here, but a google around topics like GZip, StreamReaders and Handlers should be enough to get most developers to a solution.
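To give an idea of the shape of such a handler, here is a minimal sketch (the file names, paths and cache policy are illustrative, not our production code):

using System.IO;
using System.IO.Compression;
using System.Web;

public class CombinedScriptHandler : IHttpHandler
{
    // Listed in the order they must be loaded: common functions first.
    private static readonly string[] ScriptFiles =
    {
        "~/ExternalJavaScript/common.js",
        "~/ExternalJavaScript/contact.js"
    };

    public bool IsReusable { get { return true; } }

    public void ProcessRequest(HttpContext context)
    {
        context.Response.ContentType = "text/javascript";

        // Compress the response if the browser supports gzip.
        var acceptEncoding = context.Request.Headers["Accept-Encoding"] ?? string.Empty;
        if (acceptEncoding.Contains("gzip"))
        {
            context.Response.Filter = new GZipStream(context.Response.Filter, CompressionMode.Compress);
            context.Response.AppendHeader("Content-Encoding", "gzip");
        }

        // Control client-side caching from one place.
        context.Response.Cache.SetCacheability(HttpCacheability.Public);
        context.Response.Cache.SetExpires(System.DateTime.UtcNow.AddHours(1));

        // Concatenate the files, in order, into a single script.
        foreach (var file in ScriptFiles)
        {
            using (var reader = new StreamReader(context.Server.MapPath(file)))
            {
                context.Response.Write(reader.ReadToEnd());
                context.Response.Write(";\n"); // guard against missing trailing semicolons
            }
        }
    }
}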

So this is the script that goes in the OnLoad of the CRM entity:

function loadJavaScript(file, onComplete) {
    var script = document.createElement('script');
    script.type = 'text/javascript';
    script.src = file;
    if (onComplete) {
        // onreadystatechange is the IE mechanism for detecting script load,
        // which is all we need as CRM 4 only supports Internet Explorer.
        script.onreadystatechange = function() {
            if (this.readyState == 'complete' || this.readyState == 'loaded') {
                onComplete();
            }
        };
    }
    document.getElementsByTagName('head')[0].appendChild(script);
}

loadJavaScript('/ISV/Northwind/ExternalJavaScript/contact.ashx', function() { if (EntityFormOnLoad) { EntityFormOnLoad(); } });

The contact.ashx file is the handler that does all the work: it is where you define which JavaScript files to include, and it does the merging and compressing. It means you only need to load one external JavaScript reference, and you can guarantee the order in which the external files will be loaded.

Another benefit of this approach is that you can control things like cache duration, which can be a problem when changing and deploying new external files to clients. We achieved a significant saving on the JavaScript load time, from 900ms down to 200ms, and we have ideas on how to improve it further, but this is as far as we have got today!

MS CRM4, MS CRM, C#, programming, Software Development »

[17 Sep 2010 | 0 Comments]

Now this might seem like something that would be easy to do, but I’ve just spent 2 days struggling to do it because of what I consider a bug in one of the SDK wrappers. I have now found a workaround to enable unit testing, which I will share with you now.

The Error message

Test method XrmEntityWrappers.Tests.CaseEntity.GetCaseByTicketNumber threw exception:  System.TypeInitializationException: The type initializer for 'Microsoft.Xrm.Client.Caching.Cache' threw an exception. --->  System.IO.DirectoryNotFoundException: Could not find a part of the path 'appDomain=UnitTestAdapterDomain_ForC:\Projects\Thg.Ohov.Crm\SourceCode\Thg.Ohov.Crm\TestResults\dh27_WIN-51UPWVCUQ6V 2010-09-16 18_21_40\Out\XrmEntityWrappers.Tests.dll:key=Microsoft.Xrm.Client.Caching.InMemoryCacheProvider'..

System.IO.__Error.WinIOError(Int32 errorCode, String maybeFullPath)
b__0(Object userData)
System.Runtime.CompilerServices.RuntimeHelpers.ExecuteCodeWithGuaranteedCleanup(TryCode code, CleanupCode backoutCode, Object userData)
System.Threading.Mutex..ctor(Boolean initiallyOwned, String name, Boolean& createdNew, MutexSecurity mutexSecurity)
System.Threading.Mutex..ctor(Boolean initiallyOwned, String name)
Microsoft.Xrm.Client.Threading.MutexExtensions.Lock(String key, Int32 millisecondsTimeout, Action`1 action)
Microsoft.Xrm.Client.Threading.MutexExtensions.Get[T](String key, Int32 millisecondsTimeout, Func`2 loadFromCache, Func`2 loadFromService)
Microsoft.Xrm.Client.Threading.MutexExtensions.Get[T](String key, Int32 millisecondsTimeout, Func`2 loadFromCache, Func`2 loadFromService, Action`2 addToCache)
Microsoft.Xrm.Client.Threading.MutexExtensions.Get[T](String key, Func`2 loadFromCache, Func`2 loadFromService, Action`2 addToCache)
Microsoft.Xrm.Client.Caching.InMemoryCacheProvider.GetExtendedCache()
Microsoft.Xrm.Client.Caching.CacheManager.GetExtendedCache()
Microsoft.Xrm.Client.Caching.Cache..cctor()
Microsoft.Xrm.Client.Caching.Cache.Get[T](String label, Func`2 load)
Microsoft.Xrm.Client.CrmConnection..ctor(String connectionStringName, String connectionString)
Microsoft.Xrm.Client.CrmConnection.Parse(String connectionString)
Thg.Ohov.Crm.Core.XrmEntityWrappers.XrmAdapter..ctor() in C:\Projects\Thg.Ohov.Crm\SourceCode\Thg.Ohov.Crm\Core\XrmEntityWrappers\XrmAdapter.cs: line 28
Thg.Ohov.Crm.Core.XrmEntityWrappers.incident.get_XrmAdapter() in C:\Projects\Thg.Ohov.Crm\SourceCode\Thg.Ohov.Crm\Core\XrmEntityWrappers\incident.cs: line 25
Thg.Ohov.Crm.Core.XrmEntityWrappers.incident.GetIncident(String caseId) in C:\Projects\Thg.Ohov.Crm\SourceCode\Thg.Ohov.Crm\Core\XrmEntityWrappers\incident.cs: line 44
XrmEntityWrappers.Tests.CaseEntity.GetCaseByTicketNumber() in C:\Projects\Thg.Ohov.Crm\SourceCode\Thg.Ohov.Crm\XrmEntityWrappers.Tests\CaseEntity.cs: line 23

The reason for the error

The Microsoft.Xrm.Client.dll tries to create a Mutex whose name is derived from:

Thread.GetDomain().FriendlyName;

When running ordinarily in a console app or web app this is not a problem, as the FriendlyName does not contain any ‘\’ characters. However, unit test frameworks do put ‘\’ characters in GetDomain().FriendlyName, which then causes the Mutex constructor to throw a ‘System.IO.DirectoryNotFoundException’.

The fix

The real fix is for Microsoft to update the Microsoft.Xrm.Client.dll so that it doesn’t put any ‘\’ characters into the Mutex constructor. However, my workaround is thanks to Nick Watkins, who found this article on how to change GetDomain().FriendlyName:

http://www.timvasil.com/blog14/post/2008/11/Fixing-Instance-names-used-for-writing-to-custom-counters-must-be-127-characters-or-less.aspx

The key bit of code is this, if you want to set the FriendlyName to ‘Test’ (which doesn’t have any ‘\’ characters!):

typeof(AppDomain).GetMethod("nSetupFriendlyName", BindingFlags.NonPublic | BindingFlags.Instance).Invoke(AppDomain.CurrentDomain, new object[] { "Test" });

The trick is to rename GetDomain().FriendlyName before calling any of the wrapper code in the unit tests. So the test might look a bit like this:

[TestMethod]
public void GetIncidentTest()
{
    // Rename the AppDomain first so the Mutex name created by the
    // Xrm cache contains no '\' characters.
    typeof(AppDomain).GetMethod("nSetupFriendlyName", BindingFlags.NonPublic | BindingFlags.Instance).Invoke(AppDomain.CurrentDomain, new object[] { "Test" });

    string caseId = "1234"; // TODO: Initialize to an appropriate value
    incident expected = null; // TODO: Initialize to an appropriate value
    incident actual = incident.GetIncident(caseId);
    Assert.AreEqual(expected, actual);
}
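If lots of tests hit the wrappers, a variation is to apply the rename once for the whole test run with an AssemblyInitialize method (an untested sketch, not part of the original workaround):

[TestClass]
public class TestSetup
{
    [AssemblyInitialize]
    public static void RenameAppDomain(TestContext context)
    {
        // Same reflection trick as above, run once before any test executes.
        typeof(AppDomain)
            .GetMethod("nSetupFriendlyName", BindingFlags.NonPublic | BindingFlags.Instance)
            .Invoke(AppDomain.CurrentDomain, new object[] { "Test" });
    }
}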

Summary

I’m happy now that I can unit test my custom code that uses the xRM wrappers, and I hope that my support call with Microsoft will result in the SDK dll being updated.

C#, charity, hacking, PayPal »

[22 Sep 2009 | 5 Comments]

I signed up for this PayPal event 5 weeks ago when I first saw it on Twitter - CharityHack. I thought it was going to be a couple of days of workshops where PayPal show developers how to use their new Adaptive Payments API… how wrong was I!

The penny only dropped More...

ASP.net, MVC, C#, Software Developement »

[11 Mar 2009 | 6 Comments]

I have arranged for Ian Crowther, an ex-colleague from Avanade, to come and do a brown bag session for me and my employees this Saturday 14th March at my office near Haslemere in the UK.

Ian has been working a lot with Microsoft’s MVC.net and Yahoo UI recently. He is going to give a presentation and then run a practical coding workshop showing how to combine them to More...

ASP.net, BlogEngine.NET, C#, Software Developement »

[6 Feb 2009 | 12 Comments]

I have been using BlogEngine.NET for over a year now and think it is a great application. In this post I will tell you how I successfully tweaked it to deliver the lightweight CMS system that I required. (I have published the source code at the bottom of this article.)

At the time of writing I have successfully implemented a few websites powered completely by BlogEngine.NET:

http://www.seethelink.co.uk

http://www.petersfieldparish.org.uk

http://www.tonytinman.co.uk

http://www.marvelav.com

More...

C#, Software Developement »

[11 Oct 2008 | 2 Comments]

I was told about a new feature in C# 3.0 called extension methods back in 2007 sometime by the technical architect at my old company. He explained that Microsoft had created them to enable LINQ and that they allow you to extend base classes with your own functions, which is pretty cool. But what's a useful example?
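(As a generic illustration of the mechanism, not the example Peter gave: an extension method is just a static method whose first parameter is marked with this.)

public static class StringExtensions
{
    // Callable as myString.Truncate(10) on any string instance.
    public static string Truncate(this string value, int maxLength)
    {
        if (string.IsNullOrEmpty(value) || value.Length <= maxLength)
        {
            return value;
        }
        return value.Substring(0, maxLength);
    }
}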

Peter (my TA) said More...

C#, Dataloading, Excel, Xml »

[25 Apr 2008 | 0 Comments]

Have you ever been given a spreadsheet to import into some system or other?

Have you then tried creating a simple dataloading program to read the data and run some dataloading rules against it before loading it into the correct fields on the target system?

Have you found this has caused problems in code because:

A) Everything is a string.
B) It is hard to access the correct spreadsheet columns in code because of all the column index numbers being used.
C) The Excel runtime must be installed on the computer running the dataloading tool.

If you have shared this pain and have not found a better solution then please read on.
More...