tag:blogger.com,1999:blog-34150403926144863582024-03-12T19:02:44.848-07:00Zignals blogTrading SignalsDeclan (@zignals.com)http://www.blogger.com/profile/11340587089163195922noreply@blogger.comBlogger6125tag:blogger.com,1999:blog-3415040392614486358.post-2565389690057824452008-08-20T05:12:00.000-07:002008-08-20T05:18:28.632-07:00The cost of a character...<span class="fullpost"></span>Having received some feedback from one of our <a href="http://www.zignals.com/">members</a> that our new version of our <a href="http://www.zignals.com/main/charts/trystockcharts.aspx">Stock Charts</a> had a problem with sliding menus sticking, and given the absence of the lead Silverlight developer, it fell to me to investigate and solve this strange problem.<br /><br />A real head-scratcher, this one, since it worked fine on our development machines and on our test and staging environments (the majority of bugs are picked off from our test servers, a few unusual ones may creep through to the staging servers, before giving the all-clear to be released live!). This made it a tough problem to diagnose (since you can’t debug something that is essentially “working” on your development pc!). What made it even more difficult is that the lead Silverlight developer on the charting project is on holidays (always the time when something goes wrong!). Even more interesting about the problem was that in both “fullscreen” Silverlight mode and when the “Annotations” panel was pinned, the slider sticking problem simply went away! Two modes completely unrelated to the menus appeared to “fix” the problem!<br /><br />To make a long story (in my case a two-day trawl through thousands of lines of code) short, it turned out that during an asynchronous, looping call to an animation (the “Slide” in question) the code was stepping through the animation at intervals of 29px and was trying to tell the main class that the animation was over when the animation reached or exceeded the maximum height of the panel. 
In our case, with the new version of the Charts, the maximum height happened to be 377px. Now purely coincidentally, 29 happens to divide into 377 exactly 13 times, so when the line of code read something like “if x > y, finish animation” then because 377 is not greater than 377 it kept going... going nowhere! So a simple addition of an “=” character, to read “if x >= y, finish animation”, solved the problem.<br /><br />Such a simple solution, but unfortunately one to a problem buried deep in code that would work perfectly on a developer’s pc and only go wrong when the height of the Silverlight application was constrained to a multiple of 29, making it a needle in a haystack to find.<br /><br /><span style="font-size:80%;color:#cccccc;">Scott Tattersall, Chief Technology Officer, <a href="http://www.zignals.com/">Zignals.com</a> the free stock alerts, market alerts, and stock charts website </span>Scotthttp://www.blogger.com/profile/14820723276683152547noreply@blogger.com0tag:blogger.com,1999:blog-3415040392614486358.post-34786491724575714002008-05-01T09:08:00.000-07:002008-05-19T02:49:04.681-07:00Semantic web vs Actual Web<span style="font-family:verdana;"><span>Back in 2001, I read an article on the </span><a href="http://www.sciam.com/article.cfm?id=the-semantic-web&page=1"><span>Semantic Web</span></a><span> that envisaged a future web where the semantics and context of various web resources were contained in the meta-data of these resources, making it easier/possible for autonomous agents to browse the web on a user’s behalf. It seemed like an unachievable web Utopia at the time, but the logic of the arguments was nonetheless undeniable. 
</span></span><br /><span style="font-family:verdana;"><span></span></span><br /><span style="font-family:verdana;"><span>What has me writing this post is the thought that perhaps the “Actual web” is gradually evolving into the “Semantic web” all on its own, without needing any of the suggested frameworks proposed in the original paper. It’s not the idealised version originally envisaged, but there are a number of contributing factors that I think support this premise: </span></span><span class="fullpost"><br /><span style="font-family:verdana;"><span><ul><li>The independent growth of various technologies (e.g. Google’s unbeatable search capabilities, and web page scanning technology such as OpenKapow) </li><li>The ubiquity of social websites and user-generated content </li><li>The growing phenomenon of the “mashup” </li><li>The “open standards” being proposed (and adopted) by industry giants, including </span><a href="http://code.google.com/apis/opensocial/"><span>OpenSocial</span></a><span>, </span><a href="http://www.apml.org/"><span>APML</span></a><span>, </span><a href="http://openid.net/"><span>OpenID</span></a><span><br /></li></ul></span><span>All of that leads to a situation where we have open languages for describing things on social sites (OpenSocial/APML) that can be “consumed” by mashup creators and “published” as web services that are accessible, and “understandable”, by any agent capable of reading XML.<br /><br /></span><span></span><span>So perhaps it’s the evolution of the agent that is lagging behind? 
</span><br /><span></span><br /><span>For example, a well-programmed agent that is asked about the “weather” by a user might do a search on Google for “weather web service wsdl”, which yields the following url in the very first search result: </span><a href="http://www.weather.gov/forecasts/xml/DWMLgen/wsdl/ndfdXML.wsdl"><span>http://www.weather.gov/forecasts/xml/DWMLgen/wsdl/ndfdXML.wsdl</span></a><span> which is consumable by a web-service-enabled “agent” that would, in theory, be able to check the weather at a given location (perhaps based on the location given in the user's social profile gained from APML/OpenSocial), which is presumably what a user would expect when asking about the weather. </span><br /><span></span><br /><span>So </span><span>is it just that nobody has built a good enough agent yet? <br /><br /><span style="font-size:80%; color:#cccccc;">Scott Tattersall is lead developer of stock alerts, stock charts, and market sentiment for <a href="http://www.zignals.com">Zignals</a></span></span></span></span></span></span><br /></span></span>Scotthttp://www.blogger.com/profile/14820723276683152547noreply@blogger.com0tag:blogger.com,1999:blog-3415040392614486358.post-7149468960177340302008-04-25T05:23:00.000-07:002008-05-19T02:48:30.056-07:00Trouble with Silverlight/WCF deploymentWe’ve been having a lot of trouble moving Silverlight projects from development machines to production servers recently. This is especially true of new Silverlight beta 2 projects using WCF as the data communication layer. The problem seems to arise when we try to move from a VS2008 development environment (where the WCF service runs by default off the in-built VS web server) to an IIS-based production server. 
We don’t want to be running a temporary web server on a production environment (indeed we normally wouldn’t even have one, as the development environment would not be installed) and we can’t link the Silverlight project to the “localhost” implementation of the WCF service.<br /><br />So we have to change the addressing of the Silverlight web service from localhost to zignals.com, and we have to run the web service from IIS rather than the temporary web server it would normally run from when you run the project on the development machine. These changes have led to us seeing a lot of errors involving:<span class="fullpost"><br /><strong>1) Endpoint mismatches<br />2) Cross-domain security<br />3) IIS configuration issues<br />4) References not updating<br /></strong><br />We haven’t solved all the above consistently, but through a lot of trial and error we have got our projects deployed online. The things to look out for:<br />1) Register .xap files as MIME types in IIS<br />2) Modify the projects to all use the IIS server on the developer machines instead of the default web server<br />3) Put the WCF service on the deployment server, then modify the web reference in Silverlight to point to the deployment server instead of localhost, then add the Silverlight project to the deployment server<br />4) Modify the hosts file on the deployment server to point your domain name to 127.0.0.1 so you can use the project on the server as well as remotely<br />5) Use clientaccesspolicy.xml to avoid cross-domain issues<br />6) When re-referencing a new web service, you should do a search through all files in the project for the original reference (e.g. “localhost”). 
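As an illustration of point 6, the generated ServiceReferences.ClientConfig in the Silverlight project is one of the places a stale “localhost” reference can hide. A sketch of the change (service, contract, and port names here are purely illustrative, not our actual ones):

```xml
<!-- ServiceReferences.ClientConfig (names/ports are hypothetical) -->
<configuration>
  <system.serviceModel>
    <client>
      <!-- before: address="http://localhost:52432/ChartService.svc" -->
      <!-- after: points at the production IIS host -->
      <endpoint address="http://www.zignals.com/ChartService.svc"
                binding="basicHttpBinding"
                contract="ChartServiceReference.IChartService"
                name="BasicHttpBinding_IChartService" />
    </client>
  </system.serviceModel>
</configuration>
```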
We had a problem where some .config files didn’t auto-update and we had to manually change this reference.<br /><br />Needless to say, we aren’t the only ones facing deployment problems with 2.0 beta, so here are some links to others’ solutions:<br /><a href="http://weblogs.asp.net/tolgakoseoglu/archive/2008/03/18/silverlight-2-0-and-wcf.aspx"><span style="font-size:85%;">http://weblogs.asp.net/tolgakoseoglu/archive/2008/03/18/silverlight-2-0-and-wcf.aspx</span></a><br /><a href="http://silverlight.net/forums/p/10852/34477.aspx"><span style="font-size:85%;">http://silverlight.net/forums/p/10852/34477.aspx</span></a><br /><a href="http://archdipesh.blogspot.com/2008/02/how-do-i-deployhost-wcf-service-on.html"><span style="font-size:85%;">http://archdipesh.blogspot.com/2008/02/how-do-i-deployhost-wcf-service-on.html</span></a><br /><br /><span style="font-size:80%; color:#cccccc;">Scott Tattersall is lead developer of stock alerts, stock charts, and market sentiment for <a href="http://www.zignals.com">Zignals</a></span></span></span></span></span>Scotthttp://www.blogger.com/profile/14820723276683152547noreply@blogger.com0tag:blogger.com,1999:blog-3415040392614486358.post-23196819939142522522008-04-22T05:46:00.001-07:002008-06-13T06:39:28.254-07:00Performance of Web ApplicationsThere are a lot of things that can be done to a web application, both on the front end and on the back end, to make it run faster, use less bandwidth and utilise less server processing power. In some cases the results are dramatic for the end user, and in other cases they are dramatic for the number of concurrent users a server/bandwidth will support. In all cases it is a very useful and productive exercise. 
We spent quite a lot of time trawling around the web and various forums and user groups trying to put together a standard list of performance enhancements that have a low overhead for implementation, are repeatable and easy for developers to put into general practice without headache, and which produce measurable improvements.<span class="fullpost"> I’ll summarise the pared-down list here, but for anyone keen to extract every last processing cycle from their servers, I’ve listed a “top 10” set of links at the end of this article to some of the performance articles we found most illuminating, all of which contain very useful information. And although it’s not aimed at .NET applications specifically, if you just pick one to explore, I really recommend the Yahoo performance best practices list (<a href="http://developer.yahoo.com/performance/rules.html">http://developer.yahoo.com/performance/rules.html</a> )<br /><br /><strong>JavaScript single file combination/minification<br /></strong>Replace multiple JavaScript files on a web page with one large (minified) file. Copy every JavaScript file that is used for a particular webpage into one single master JavaScript file called, say, “AllScript.js”. Then replace the "script" tags that reference these files with one single script reference for the “AllScript.js” file.<br /><br />This script reference should be placed as close as possible to the bottom of the webpage so that visual content is loaded first, without js files slowing down the loading of content such as css files and images/media. All JavaScript within a page should be transferred to the external JS file.<br />The master JavaScript file can be further minified by using a handy program called JSMIN (<a href="http://www.crockford.com/javascript/jsmin.html">http://www.crockford.com/javascript/jsmin.html</a>) which removes whitespace from the input file and writes the minified file to output. 
This minified version can be referenced just the same, by referencing it like “AllScript_min.js” at the bottom of the aspx file. This file can also be referenced in the asp:ScriptManager/ToolScriptManager control by setting the ScriptReference tag's Path attribute to the path of the file. If the attribute LoadScriptsBeforeUI is set to false then any referenced JS files are placed at the bottom of the webpage when rendered.<br /><br />Using the Firebug tool for Firefox ([link]) we can inspect all the JS files that are requested and downloaded to the browser as the page runs. When the AjaxControlToolkit is used, the client-side JS files that it uses are named “ScriptResource.axd” and are dynamically referenced and downloaded to the browser. This results in a large number of separate requests (which we want to avoid) so an option exists where these files can be combined into one single HTTP request. This can be done by setting the CombineScripts attribute on the ToolScriptManager control to true. ToolScriptManager inherits from the ScriptManager control so it is fine to substitute for the ScriptManager control in aspx pages. (link)<br /><br /><strong>CSS single file combination/minification</strong><br />CSS files should be referenced in the head section of the HTML/aspx page as we want the visual content to load before the script files. Similar to the above JS single file combination/minification process, we can combine all referenced CSS files required for a particular page into a single master CSS file called, say, “AllCSS_min.css” and just reference this in the "link" tag inside the header. 
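Putting the two combination steps together, the page ends up referencing just one stylesheet (in the head) and one script (at the bottom). A sketch, using the file names from the examples above (paths are illustrative):

```html
<head>
    <!-- single combined, minified stylesheet loads first -->
    <link rel="stylesheet" type="text/css" href="AllCSS_min.css" />
</head>
<body>
    <!-- page content ... -->

    <!-- single combined, minified script loads last -->
    <script type="text/javascript" src="AllScript_min.js"></script>
</body>
```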
The CSS files can be simply copy/pasted into the master CSS file, and a tool called CSSMIN (<a href="http://weblogs.asp.net/zowens/archive/2008/02/15/improve-asp-net-performance-cssmin.aspx">http://weblogs.asp.net/zowens/archive/2008/02/15/improve-asp-net-performance-cssmin.aspx</a>) minifies these into one single CSS file.<br /><br /><strong>IIS 6.0 compression<br /></strong>Enabling compression is a must. On IIS 6.0 (e.g. Windows 2003 Server) static compression is on by default. Dynamic compression can be activated by running a script or through the IIS 6.0 manager. The below reference explains the procedure.<br /><a href="http://www.microsoft.com/technet/prodtechnol/WindowsServer2003/Library/IIS/502ef631-3695-4616-b268-cbe7cf1351ce.mspx?mfr=true">http://www.microsoft.com/technet/prodtechnol/WindowsServer2003/Library/IIS/502ef631-3695-4616-b268-cbe7cf1351ce.mspx?mfr=true</a><br /><br /><strong>CSS sprites/multiple image combination<br /></strong>For each image referenced for a particular page in your website a separate HTTP request is issued to download it to the browser. If you have a large number of images this can use a lot of bandwidth and is inefficient.<br /><br />Images can instead be combined into one larger image and this can be downloaded as one HTTP request. In the browser, sections of the webpage that want to display an image (e.g. div elements) can all reference the same master image by setting the background-image property to, for example, “sprites1.png”, and supplying offsets for the background-position property to “pick out” the image required from the master image. This process is made easier by a 3rd party tool called CSS Sprites Generator <a href="http://www.csssprites.com/">http://www.csssprites.com/</a>. 
Simply upload all the images used on a particular webpage and hit generate, and this website automatically combines all images into one single image, supplying offsets as shown:<br /><a href="http://bp3.blogger.com/_wGdKfLZatBU/SA4QSg7xKzI/AAAAAAAAABk/Bkg1G0IyBR4/s1600-h/examplecsssprites1.PNG"><img id="BLOGGER_PHOTO_ID_5192105330669071154" style="FLOAT: right; MARGIN: 0px 0px 10px 10px; CURSOR: hand" alt="" src="http://bp3.blogger.com/_wGdKfLZatBU/SA4QSg7xKzI/AAAAAAAAABk/Bkg1G0IyBR4/s320/examplecsssprites1.PNG" border="0" /></a><br /><br />.info {<br />background-image:url(sprites1.png);<br />background-position:-66px -66px;<br />}<br />.lightning {<br />background-image:url(sprites1.png);<br />background-position:-66px -246px;<br />}<br />.magnify {<br />background-image:url(sprites1.png);<br />background-position:-66px -510px;<br />}<br /><br />NOTE: Elements that tile an image with the CSS background-repeat property cannot be used in this process. Animated .gifs will not work either.<br /><br />And to add new images to the master image, you need to upload all the previous images in the master image in the same order again to keep the same offset values. It is important to keep a record of the offsets for each image within the master image for reference.<br /><br /><strong>Web.config/Machine.config optimal settings<br /></strong>For production websites, it’s important to remember to set the <compilation debug="false" /> setting in Web.config. This ensures no unnecessary debug code is generated for the release version of the website.<br />If you are not using some of the asp.net modules, such as Windows Authentication or Passport Authentication, then these can be removed from the asp.net processing pipeline as they will be unnecessarily loaded otherwise. 
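A sketch of what that removal might look like in Web.config — the names shown are standard ASP.NET 2.0 module registrations from machine.config; trim the list to match what your site actually does without:

```xml
<!-- Web.config: remove pipeline modules the site doesn't use (illustrative selection) -->
<system.web>
  <httpModules>
    <remove name="WindowsAuthentication" />
    <remove name="PassportAuthentication" />
    <remove name="AnonymousIdentification" />
  </httpModules>
</system.web>
```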
Typical candidates for removal are the Windows Authentication and Passport Authentication modules.<br /><br />ASP.NET Process Model configuration defines some process-level properties, such as how many threads ASP.NET uses, how long it blocks a thread before timing out, how many requests to keep waiting for IO work to complete, and so on. On fast servers with a lot of RAM, the process model configuration can be tweaked to make the ASP.NET process consume more system resources and provide better scalability from each server.<br /><br />The below settings can help performance:<br /><processModel maxAppDomains="2000" asyncOption="20" minIoThreads="30" minWorkerThreads="40" maxIoThreads="100" maxWorkerThreads="100" responseRestartDeadlockInterval="00:03:00" responseDeadlockInterval="00:03:00" memoryLimit="60" restartQueueLimit="10" requestQueueLimit="5000" requestLimit="Infinite" shutdownTimeout="00:00:05" idleTimeout="Infinite" timeout="Infinite" enable="true" /><br /><strong>Caching 3rd Party data & generated images </strong><br />If you are acquiring data from 3rd party sites (e.g. RSS feeds, mashup data, web services, etc.) then it can be a good idea to cache this data for short periods (depending on how “real-time” the data needs to be). It can make a significant difference in page loading time when there are many remote requests for this sort of data. In our case, for example, we allow users to specify RSS feeds that they are interested in monitoring. Since many users can specify the same popular feeds, we can cache the RSS data returned from the remote site as XML and store it in the database for a short period (e.g. 10 mins). By doing this only the first person to request the RSS feed will have to experience the delay whereby our server has to send off a request to the remote server where the RSS data resides. 
All subsequent users during the cache period will receive their data directly from our Cache, negating the latency and bandwidth requirements associated with contacting the remote server.<br /><br />We also use a 3rd party charting control that generates an image (.png/.jpeg) on the server when it creates a chart. We cannot cache these images where users specify user-specific parameters to generate them, but when images are generated which are the same for each user, (e.g. default chart images that only update on a daily basis), then we can cache them for 1 day and avoid the expensive process of recreating this chart image every time a user requests one of these “default” type images.<br /><br /><strong>Further Reading:<br /></strong><span style="font-size:85%;">1) </span><a href="http://developer.yahoo.com/performance/rules.html"><span style="font-size:85%;">http://developer.yahoo.com/performance/rules.html</span></a><span style="font-size:85%;"><br />2) </span><a href="http://msdn2.microsoft.com/en-ie/magazine/cc163901(en-us).aspx"><span style="font-size:85%;">http://msdn2.microsoft.com/en-ie/magazine/cc163901(en-us).aspx</span></a><span style="font-size:85%;"><br />3) </span><a href="http://msdn2.microsoft.com/en-ie/magazine/cc163854(en-us).aspx"><span style="font-size:85%;">http://msdn2.microsoft.com/en-ie/magazine/cc163854(en-us).aspx</span></a><span style="font-size:85%;"><br />4) </span><a href="http://msmvps.com/blogs/omar/archive/2007/03/16/asp-net-ajax-in-depth-performance-analysis.aspx"><span style="font-size:85%;">http://msmvps.com/blogs/omar/archive/2007/03/16/asp-net-ajax-in-depth-performance-analysis.aspx</span></a><span style="font-size:85%;"><br />5) </span><a href="http://msdn2.microsoft.com/en-us/library/ms998549.aspx"><span style="font-size:85%;">http://msdn2.microsoft.com/en-us/library/ms998549.aspx</span></a><span style="font-size:85%;"><br />6) </span><a href="http://www.codeproject.com/KB/aspnet/10ASPNetPerformance.aspx"><span 
style="font-size:85%;">http://www.codeproject.com/KB/aspnet/10ASPNetPerformance.aspx</span></a><span style="font-size:85%;"><br />7) </span><a href="http://www.maxkiesler.com/index.php/comments/decrease_load_time_and_increase_roi_in_web_20_and_ajax_sites/"><span style="font-size:85%;">http://www.maxkiesler.com/index.php/comments/decrease_load_time_and_increase_roi_in_web_20_and_ajax_sites/</span></a><span style="font-size:85%;"><br />8) </span><a href="http://www.testingfaqs.org/t-load.html"><span style="font-size:85%;">http://www.testingfaqs.org/t-load.html</span></a><span style="font-size:85%;"><br />9) </span><a href="http://visualstudiomagazine.com/features/article.aspx?editorialsid=1748"><span style="font-size:85%;">http://visualstudiomagazine.com/features/article.aspx?editorialsid=1748</span></a><span style="font-size:85%;"><br />10) </span><a href="http://msdn2.microsoft.com/en-us/library/ms973839.aspx"><span style="font-size:85%;">http://msdn2.microsoft.com/en-us/library/ms973839.aspx</span></a><br /><br /><span style="font-size:80%;color:#cccccc;">Scott Tattersall is lead developer of stock alerts, stock charts, and market sentiment for <a href="http://www.zignals.com/">Zignals</a></span></span></span></span><br /></span>Scotthttp://www.blogger.com/profile/14820723276683152547noreply@blogger.com0tag:blogger.com,1999:blog-3415040392614486358.post-7810875974624160962008-04-16T01:44:00.000-07:002008-05-19T02:47:13.003-07:00Visual Studio project structure for large web applications<strong>.NET DLLs:</strong><br /><strong></strong><br />Having functionality encapsulated within .net dll projects is a vital part of any web application, and I recommend starting all new web applications with this in mind. 
The project structure we use in Zignals is:<br /><br /><br /><img id="BLOGGER_PHOTO_ID_5189765387563947426" style="DISPLAY: block; MARGIN: 0px auto 10px; WIDTH: 406px; CURSOR: hand; HEIGHT: 212px; TEXT-ALIGN: center" height="224" alt="" src="http://bp3.blogger.com/_wGdKfLZatBU/SAXAH5ld1aI/AAAAAAAAABM/4yW9zsx380Q/s400/projectarchitecture.PNG" width="425" border="0" /> <strong>Framework.dll:</strong><br />This stores all the low-level classes and functions that will be common to all web projects. For example, our framework has:<span class="fullpost"><br /><br />Session.cs: Handles all web application Session logic<br />Logging.cs: Handles all error and information logging<br />SqlWrapper: Handles all low-level interaction with the database<br />ZUtilities: All utility functions<br />Security: All encryption & security functions<br />Etc.<br /><br />Basically anything that is project-independent. If you are building just one large web application (as in our case), you could have your business logic in this dll to save hassle with multiple references/namespaces, but if you want to re-use your common functions across multiple sites you should have a separate project for the business logic.<br /><br /><strong>BusinessLogic.dll:</strong><br />Holds all the functionality common among the various Visual Studio projects that are built for the current website. Having your business logic here allows developers of different project types (e.g. windows services, web services, web applications) to use the same underlying business logic. In our case, for example, we allow a user to Simulate an investment strategy online (web application). We also have a complex algorithm that automatically builds a strategy. Because of processing requirements, this algorithm runs as part of a windows service that allows processing to queue while waiting for a free CPU. Once the strategy is auto-built, it also needs to be simulated over a historical time period to assess performance. 
Since we have the “SimulateStrategy” function in the Strategy class in the BusinessLogic.dll, both the Windows Service project and the Web Application project can simply reference the BusinessLogic.dll project output and always use the most up-to-date version.<br /><br /><strong>CustomWebControls.dll:</strong><br />This project holds any ASCX files that we use. The reason for having these in a separate project is so we can have developers building controls independently of the developers consuming them. It also means that we can re-use these controls across multiple web projects, and dynamically add them from cs code. For a detailed look at creating User Control Libraries, see <a href="http://webproject.scottgu.com/CSharp/UserControls/UserControls.aspx">http://webproject.scottgu.com/CSharp/UserControls/UserControls.aspx</a><br /><br /><strong>CustomControls.dll:</strong><br />We have a custom controls project for any custom objects that we want to be able to store in the database (e.g. serialized into our DB cache). The reason for a separate project and solution is that recompilation of a dll creates a different signature for the object, and you won’t be able to deserialise objects stored in the database after re-compilation. Since you re-compile your web projects for every new line of code, this would make serializing and storing objects in the database impossible, hence the CustomControls library. Incidentally, we generally only have very basic objects in this library so there is very little call for recompilation.<br /><br /><strong>WebApplication 1:</strong><br />Whatever web application you would normally have produced.<br /><br /><strong>Web Application 2:</strong><br />If required. We have multiple web applications because of the division of labour (e.g. 
different developers or development teams can have “autonomous” responsibility for their own project), and because having hundreds of aspx files or directories in one project is cumbersome.<br /><br />All these projects can be opened as part of one Visual Studio solution, and you can set a build order that makes the framework compile first, then the business logic, and so on all the way up to the web applications.<br /><a href="http://bp2.blogger.com/_wGdKfLZatBU/SAXCwpld1bI/AAAAAAAAABU/NjJjdrPqIYY/s1600-h/ProjectStructureNew.PNG"><img id="BLOGGER_PHOTO_ID_5189768286666872242" style="FLOAT: right; MARGIN: 0px 0px 10px 10px; CURSOR: hand" alt="" src="http://bp2.blogger.com/_wGdKfLZatBU/SAXCwpld1bI/AAAAAAAAABU/NjJjdrPqIYY/s320/ProjectStructureNew.PNG" border="0" /></a><br />In the example opposite, taken from our main VS solution, you can see the Framework and BusinessLogic projects and the 2 web projects (Dashboard and ZignalsTools).<br /><br />The CustomWebControls project is called WidgetControls for our site.<br /><br />You will also need to copy all the .ascx files for the CustomWebControls project into a directory of whichever web applications are using them. This is done by setting the pre-build event of the web project consuming the CustomWebControls to something like:<br /><br />copy $(SolutionDir)\CustomWebControls\*.ascx $(ProjectDir)\UserControls\<br /><br />You will also want to make sure that each project higher up the chain has a reference to the project output of the projects lower in the chain. E.g. BusinessLogic.dll has a reference to the project output of Framework.dll. 
This is done by right-clicking on “Add Reference” in the business logic project, selecting the “Projects” tab in the dialog box, and selecting the relevant project.<br /><br /><img id="BLOGGER_PHOTO_ID_5189764107663693186" style="DISPLAY: block; MARGIN: 0px auto 10px; CURSOR: hand; TEXT-ALIGN: center" alt="" src="http://bp1.blogger.com/_wGdKfLZatBU/SAW-9Zld1YI/AAAAAAAAAA8/84cA5Vabz9U/s400/AddReference.PNG" border="0" /> Visual Studio will automatically organise the project build order based on the project references within the solution, so you shouldn't need to do anything else to get the projects to build correctly.<br /><br /><span style="font-size:80%; color:#cccccc;">Scott Tattersall is lead developer of stock alerts, stock charts, and market sentiment for <a href="http://www.zignals.com">Zignals</a></span></span></span></span>Scotthttp://www.blogger.com/profile/14820723276683152547noreply@blogger.com0tag:blogger.com,1999:blog-3415040392614486358.post-22206282218620328522008-04-14T03:28:00.000-07:002008-07-31T06:13:58.065-07:00Scalable Caching with .NETWeb application performance can be greatly increased by caching frequently used data that would usually come from the database. This performance enhancement is further amplified if the web application has to perform some complex operations on the raw data prior to displaying it (financial calculations, graphing, etc) or if the data is coming from a 3rd party data source, such as an RSS feed from another site, or an XML web service call, where the network latency associated with acquiring the data can be a significant delay.<br /><br />From a programming perspective, there are a number of different ways of caching data available in .NET that can be useful for different purposes. There are the cache classes:<span class="fullpost"><br /><br />“HttpRuntime.Cache”<br />and<br />“HttpContext.Current.Cache”<br /><br />that can be used to cache objects. 
There are minor differences between the two, but a good argument for using the former can be found here:<br /><a href="http://weblogs.asp.net/pjohnson/archive/2006/02/06/437559.aspx">http://weblogs.asp.net/pjohnson/archive/2006/02/06/437559.aspx</a><br /><br />So, now you have your in-built caching class, what’s wrong with calling:<br /><br />HttpRuntime.Cache.Insert("SomeKey", someObject)<br /><br />for all your cacheable web objects?<br /><br />The problem lies in what happens when you move your application from a single web server environment to a dual-server, multi-server or web farm environment. The output of data requests made to one server will be cached on that server, but there is no guarantee that the next request made for the same data will be made to the same web server in the web farm, which will mean another trip to the database and a re-caching of the data on the new server, and so on:<br /><br /><a href="http://bp2.blogger.com/_wGdKfLZatBU/SAMydJld1SI/AAAAAAAAAAM/cPlfnAh1_ZA/s1600-h/diagram1.PNG"><img id="BLOGGER_PHOTO_ID_5189046672031601954" style="CURSOR: hand" alt="" src="http://bp2.blogger.com/_wGdKfLZatBU/SAMydJld1SI/AAAAAAAAAAM/cPlfnAh1_ZA/s320/diagram1.PNG" border="0" /></a><br />Our solution is to have 2 levels of caching: the ASP.NET memory cache, implemented using the HttpRuntime.Cache class, and a DBCache, which serialises objects and stores them in our Cache database.<br /><br />With a dual-layer approach, we access the raw data in our database the first time we use the data object, then we add the resulting object into the DBCache and the MemoryCache. If the user's next request for the same object happens on the same webserver as before, they get the object directly from the IIS inProc cache (HttpRuntime). 
If the same request happens on a different web server in the server farm, the user's request is served pre-generated, from a de-serialisation of the object in the DBCache (faster than accessing the raw data tables in the main database and re-computing the object). This web server's application cache is now also populated with the same data object, so the next time it is requested at this web server the response comes straight from the memory cache.<br /><br />We generally use this dual caching for generated objects shared by many users. This means that the first user request will be cached and all further requests by any user will be served from the caches.<br /><br />An example in our case would be when a user requests a 14-day moving average of Microsoft stock (MSFT) for the last 5 years. The first request takes the raw price data from our Prices table and computes an array of doubles representing the MA value for each day over the last 5 years. It is very likely that another user will want to calculate the same values (or a portion thereof, but how we serve that from our cache is a different story!) so we serialise the double array and store it in the caches. A subsequent request for the same calculation requires no trip to the large Prices table and no computation; the only question is whether the request is fulfilled by the IIS cache on the web server or by our DBCache.<br /><br /><a href="http://bp1.blogger.com/_wGdKfLZatBU/SAM1H5ld1WI/AAAAAAAAAAs/HjFGLjWa_Q0/s1600-h/diagram2.PNG"><img id="BLOGGER_PHOTO_ID_5189049605494265186" style="CURSOR: hand" alt="" src="http://bp1.blogger.com/_wGdKfLZatBU/SAM1H5ld1WI/AAAAAAAAAAs/HjFGLjWa_Q0/s400/diagram2.PNG" border="0" /></a><br />We store all the DBCache data on our “CacheServer”, which is a separate physical server running a copy of SQL Server (i.e. independent of our main SQL Server). 
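The dual-layer lookup described above can be sketched in C# roughly as follows. This is a minimal sketch, not our production code: the IDbCache interface and the GetOrAdd helper are hypothetical stand-ins for our actual DBCache classes.

```csharp
using System;
using System.Web;
using System.Web.Caching;

// Hypothetical second-level cache backed by the Cache database.
public interface IDbCache
{
    object Get(string key);                            // de-serialises the stored object, or null on miss
    void Put(string key, object value, TimeSpan ttl);  // serialises and stores the object
}

public static class DualCache
{
    public static object GetOrAdd(string key, IDbCache dbCache,
                                  Func<object> compute, TimeSpan ttl)
    {
        // 1. Fastest path: this web server's in-proc cache.
        object value = HttpRuntime.Cache.Get(key);
        if (value != null)
            return value;

        // 2. Next: the shared DB cache — another server in the farm
        //    may already have computed and stored this object.
        value = dbCache.Get(key);
        if (value == null)
        {
            // 3. Slowest path: hit the main database and compute the object.
            value = compute();
            dbCache.Put(key, value, ttl);
        }

        // Populate this server's memory cache so subsequent requests
        // at this server are served directly from memory.
        HttpRuntime.Cache.Insert(key, value, null,
                                 DateTime.UtcNow.Add(ttl),
                                 Cache.NoSlidingExpiration);
        return value;
    }
}
```

For the moving-average example, the key would encode the symbol and parameters (something like "MA:MSFT:14:5y"), with the compute delegate performing the calculation against the Prices table.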
Incidentally, you don’t need a full enterprise edition of SQL Server for the cache server, as SQL Server Express edition has more than enough power and capacity for the needs of the cache, and it’s free.<br /><br />This approach also has the added benefit of allowing us to persist our cache for as long as we want. The cache is not destroyed when a web server restarts, for example, and we can add additional web servers very easily, knowing that they will have instant access to a long history of cached objects generated by the web applications running on the other web servers.<br /><br /><strong>LRU Policy:</strong><br /><br />Finally, we have an LRU policy for the DBCache (the LRU policy is natively implemented by the IIS cache, though it’s not obvious from the documentation). We have a cache monitoring service that runs on our cache server and automatically removes any items that are past their “expiry” date. Upon addition of new items, if there is not enough “room” in the cache, then the least recently used item is removed. The LRU policy on the SQL Server cache is handled by storing the keys in the cache table ordered by how recently they were used: a column in the cache table is always kept ordered from least recently used to most recently used. E.g. upon accessing a row in the cache table:<br /><br /><a href="http://bp2.blogger.com/_wGdKfLZatBU/SAMzIJld1UI/AAAAAAAAAAc/y38OZYGzxYw/s1600-h/diagram3.PNG"><img id="BLOGGER_PHOTO_ID_5189047410765976898" style="CURSOR: hand" alt="" src="http://bp2.blogger.com/_wGdKfLZatBU/SAMzIJld1UI/AAAAAAAAAAc/y38OZYGzxYw/s320/diagram3.PNG" border="0" /></a><br />So, removing the least recently used item means deleting the row at position 1 and decrementing the LRU column in the remaining rows by 1. 
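The ordered-column bookkeeping can be illustrated in miniature with an in-memory sketch. This is purely illustrative — the real implementation does the equivalent with SQL against the cache table, and the LruIndex class below is a hypothetical illustration, not our cache-server code.

```csharp
using System.Collections.Generic;

// Miniature illustration of the LRU-column idea: keys are kept ordered
// from least recently used (front) to most recently used (back).
public class LruIndex
{
    private readonly List<string> _order = new List<string>();
    private readonly int _capacity;

    public LruIndex(int capacity) { _capacity = capacity; }

    // On access, move the key to the most-recently-used end
    // (in the DB version, this re-numbers the LRU column).
    public void Touch(string key)
    {
        _order.Remove(key);
        _order.Add(key);
    }

    // On insertion, evict from the least-recently-used end when full;
    // returns the evicted key, or null if there was room.
    public string Add(string key)
    {
        string evicted = null;
        if (_order.Count >= _capacity)
        {
            evicted = _order[0];   // "position 1" in the cache table
            _order.RemoveAt(0);
        }
        _order.Add(key);
        return evicted;
    }
}
```

Touch corresponds to re-ordering the LRU column on a cache hit; Add corresponds to inserting a new item and, when the cache is full, deleting the row at position 1.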
(In practice, when we hit an LRU operation we delete a large batch of rows at once, to avoid constantly having to update rows as new items are inserted.)<br /><br /><strong>Further Reading:</strong><br /><a href="http://www.c-sharpcorner.com/UploadFile/raj1979/Caching12312007041352AM/Caching.aspx">http://www.c-sharpcorner.com/UploadFile/raj1979/Caching12312007041352AM/Caching.aspx</a><br /><a href="http://msdn2.microsoft.com/en-us/library/xsbfdd8c.aspx">http://msdn2.microsoft.com/en-us/library/xsbfdd8c.aspx</a><br /><a href="http://weblogs.asp.net/justin_rogers/archive/2004/10/23/246745.aspx">http://weblogs.asp.net/justin_rogers/archive/2004/10/23/246745.aspx</a><br /><br /><span style="font-size:80%; color:#cccccc;">Scott Tattersall is lead developer of stock alerts, stock charts, and market sentiment for <a href="http://www.zignals.com">Zignals</a></span></span></span><br /><a href="http://bp2.blogger.com/_WWGUfU1tOjI/SJG6hh05RzI/AAAAAAAAAS8/fcKbcofOeAI/s1600-h/feed+the+bull.png"><img style="display:block; margin:0px auto 10px; text-align:center;cursor:pointer; cursor:hand;" src="http://bp2.blogger.com/_WWGUfU1tOjI/SJG6hh05RzI/AAAAAAAAAS8/fcKbcofOeAI/s320/feed+the+bull.png" border="0" alt="" id="BLOGGER_PHOTO_ID_5229165727532533554" /></a>Scotthttp://www.blogger.com/profile/14820723276683152547noreply@blogger.com2