
Synergex Blog


Search Engine Optimization (SEO)

By Steve Ives, Posted on June 14, 2010 at 7:08 pm

I’ve been writing web applications for years, but I’ve never really had to put too much thought into whether search engines such as Google and Bing were finding the sites that I have worked on, or whether they were deducing appropriate information about those sites and giving them their appropriate ranking in search results. The reason for this is that most of the web development that I have been involved with has tended to be web “applications”, where the bulk of the interesting stuff is hidden away behind a login; without logging in there’s not much to look at, and in many cases the content of the site isn’t something you would want search engines to look at anyway … so who cares about SEO!

However, if you have a web site that promotes your company, products and services then you probably do care about SEO. Or if you don’t, you probably should! Improving your ranking with the search engines could have a positive impact on your overall business, and in these economic times we all need all the help we can get.

Turns out that the basic principles of SEO are pretty straightforward; really it’s mainly about making sure that your site conforms to certain standards and doesn’t contain errors. Sounds simple, right? You’d be surprised how many companies pay little, if any, attention to this potentially important subject, and suffer the consequences of doing so … probably without even realizing it.

You have little or no control over when search engine robots visit your site, so all you can do is try to ensure that everything that you publish on the web is up to scratch. Here are a few things that you can do to help improve your search engine ratings:

  • Ensure that the HTML that makes up your site is properly formatted. Robots take a dim view of improperly formatted HTML, and the more errors that are found, the lower your ratings are likely to be.
  • Don’t assume that HTML editors will always produce properly formatted HTML, because it’s not always the case!
  • Try to limit the physical size of each page. Robots have limits regarding the physical amount of data that they will search on any given page. After reading a certain amount of data from a page a robot may simply give up, and if there is important information at the bottom of a large page, it may never get indexed. Unfortunately these limits may be different from robot to robot, and are not published.
  • Ensure that every page has a title specified with the <TITLE> tag, and that the title is short and descriptive. Page titles are very important to search engine robots.
  • Use HTML headings carefully. Robots typically place a lot of importance on HTML heading tags, because it is assumed that the headings will give a good overall description of what the page is about. It is recommended that a page only has a single <H1> tag, and doesn’t make frequent use of subheadings (<H2>, <H3> etc.).
  • Use meta tags in each page. In particular, use the meta keywords and meta description tags to describe what the page content is about, but also consider adding other meta tags like meta author and meta copyright. Search engine robots place high importance on the data in meta tags. (A markup sketch showing these tags follows this list.)
  • Don’t get too deep! Search engines have (undocumented) rules about how many levels deep they will go when indexing a site. If you have important content that is buried several levels down in your site it may never get indexed.
  • Avoid having multiple URLs that point to the same content, especially if you have external links in to your site. How many external links point to your content is an important indicator of how relevant your site is considered to be by other sites, and having multiple URLs pointing to the same content could dilute the search engine crawler’s view of how relevant your content is to others.
  • Be careful how much use is made of technologies like Flash and Silverlight. If a site’s UI is composed entirely of pages which make heavy use of these technologies then there will be lots of <OBJECT> tags in the site that point the browser to the Flash or Silverlight content, but not much else! Robots don’t look at <OBJECT> tags (there’s no point, because they would not know what to do with the binary content anyway), so if you’re not careful you can create a very rich site that looks great in a browser … but has absolutely no content that a search engine robot can index!
  • If your pages do make a lot of use of technologies like Flash and Silverlight, consider using a <NOSCRIPT> tag to add content for search engine robots to index. The <NOSCRIPT> tag is used to hold content to display in browsers that don’t support JavaScript, but these days pretty much all browsers do. However, search engine robots DO NOT support JavaScript, so they WILL see the content in a <NOSCRIPT> section of a page!
  • Related to the previous item, avoid having content that is only available via the execution of JavaScript – the robots won’t execute any JavaScript code, so your valuable content may be hidden.
  • Try to get other web sites, particularly “popular” web sites, to have links to your content. Search engine robots consider inbound links to your site as a good indicator of the relevance and popularity of your content, and links from sites which themselves have high ratings are considered even more important.
  • Tell search engine robots what NOT to look at. If you have content that should not be indexed, for any reason, you can create a special file called robots.txt in the root folder of your site, and in it you can specify rules for what should be ignored by robots. In particular, make sure you exclude any binary content (images, videos, documents, PDF files, etc.) because these things are relatively large and may cause a robot to give up indexing your entire site! A minimal robots.txt example also follows this list. For more information about the robots.txt file refer to http://www.robotstxt.org.
  • Tell search engines what content they SHOULD look at by adding a sitemap.xml file to the root folder of your site. A sitemap.xml file contains information about the pages that you DO want search engine robots to process. For more information refer to http://www.sitemaps.org.
  • Ensure that you don’t host ANY malware on your site. Search engine robots are getting pretty good at identifying malware, and if they detect malware hosted on your site they are likely to not only give up processing the site, but also blacklist the site and never return.
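To make a couple of these points more concrete, here is a minimal, hypothetical page fragment. The site name, file names and wording are placeholders rather than anything from a real site, but it illustrates a short descriptive title, meta description, keywords, author and copyright tags, and a <NOSCRIPT> section that gives robots something to index when the visible content is delivered via Silverlight:

    <html>
      <head>
        <title>Acme Widgets - Product Catalog</title>
        <meta name="description" content="Browse the complete range of Acme widgets, with pricing and availability." />
        <meta name="keywords" content="widgets, industrial widgets, Acme" />
        <meta name="author" content="Acme Corporation" />
        <meta name="copyright" content="Copyright 2010 Acme Corporation" />
      </head>
      <body>
        <!-- Silverlight content: invisible to robots -->
        <object data="data:application/x-silverlight-2," type="application/x-silverlight-2" width="800" height="600">
          <param name="source" value="ClientBin/Catalog.xap" />
        </object>
        <!-- Fallback content that robots (and script-less browsers) can read -->
        <noscript>
          <h1>Acme Widget Catalog</h1>
          <p>Our catalog lists every widget we sell, with prices, specifications and delivery times.</p>
        </noscript>
      </body>
    </html>

And here is a simple robots.txt sketch, again with placeholder paths, that keeps robots away from binary content and points them at the sitemap:

    User-agent: *
    Disallow: /images/
    Disallow: /downloads/
    Sitemap: http://www.example.com/sitemap.xml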

Getting to grips with all of these things can be a real challenge, especially on larger sites, but there are tools out there to help. In particular I recently saw a demo of a new free tool from Microsoft called the SEO Toolkit. This is a simple application that you can use to analyze a web site in much the same way that search engine robots do, and the tool then produces detailed reports and suggestions as to what can be done to improve the SEO ratings for the site. You can also use the tool to compare changes over time, so you can see whether changes you make to the site have improved or worsened your likely SEO rating. For more information refer to http://www.microsoft.com/web/spotlight/seo.aspx.

This article only scratches the surface of what is an extensive and complex subject, but hopefully, armed with these basics, you can at least be aware of the ground rules and start to improve the ratings for your site.


Another TechEd Sticky Note

By synergexadmin, Posted on June 10, 2010 at 11:58 pm

The other night, I discovered the way to beat the heat while here in New Orleans. It’s a fruity little concoction known as the Hurricane, and while it doesn’t actually affect the climate around you, it sure makes feeling hot and sticky a lot more enjoyable. I’m also pretty sure I know how it got its name: in the morning, you find yourself trying to reconstruct the previous 12 hours of your life by putting together the pieces and fragments of your memory.

TechEd 2010 draws to a close this evening, and though it’s been increasingly difficult to find sessions that seem pertinent to us Synergexians, it’s still been a worthwhile experience.

I’ve learned a lot just by watching presenters step through the build of a Silverlight UI using Microsoft Expression, or show off the latest features of Visual Studio 2010 and how it can be used to quickly create a web app, or walk through the use of new simplified Windows Communication Foundation 4 features. I’ve even filled in the holes in my schedule with sessions on interesting (to me) topics, such as IPv6, trends in cybercrime, and hacker techniques.

Which all brings me to the point of this little blog entry: It seems to me that the value of conferences lies not in the number of sessions that directly apply to you, but in the quantity and quality of the little tidbits you pick up each day. It’s in the discussions you have with other developers and like-minded individuals – whether they take place while sitting down over a cup of coffee, or simply during a quick ride in the elevator. It’s in the creative ideas that spring up when you see a clever implementation and wonder if you can apply the same techniques to an unrelated solution of your own. It’s in the tips, tricks and techniques that you pick up, which will not only save you hours, days, and even weeks of effort in the year ahead, but which can also be shared with the rest of your team to make them more productive as well.

Just a sales pitch for SPC2010? Perhaps…but that wasn't the intent. After all, this is my blog, and with it I get to share helpful experiences from my time “out in the field.” If writing about it all means I’ll get to see more of you when we set up shop in October at the Citizen Hotel, then so much the better. But in the end, my little revelation about the value of coming to TechEd – even with so much focus on technologies that I can’t use – is helping me to sit back and enjoy this final day of the conference, secure in the knowledge that I’m going to be learning something interesting at every turn. And isn’t that what attending the conference is all about?

That, and the Hurricanes, of course…


Preparing for Windows Phone 7

By Steve Ives, Posted at 7:59 pm

By Steve Ives, Senior Consultant, Synergex Professional Services Group

Windows Phone 7

Later this year (probably) Microsoft will release a new version of their phone operating system, and it’s going to be a BIG change for developers who have created applications for the earlier Windows Mobile operating systems. The new O/S is called “Windows Phone 7”, and although under the covers it’s really still Windows CE, on the surface things will look VERY different.

Perhaps the largest single change will be the user interface of Windows Phone 7 devices, which will be entirely driven by Microsoft Silverlight. That’s potentially great news for existing Silverlight or WPF developers, but will of course mean a total re-write of the UI for developers with existing applications which were essentially based on a subset of Windows Forms.

Using Silverlight will mean that we can expect some dazzling UI from applications, and indeed the O/S and the standard applications provided with it already look pretty cool, but there will definitely be a learning curve for anyone who has not developed Silverlight applications before.

Part of the good news is that the basic tools that you need to develop Windows Phone 7 applications are free. You can download Visual Studio Express Phone Edition and have pretty much what you need to develop applications. At the time of writing though, these tools are in a “pre-beta” form, and as such you can probably expect some issues, and need to update the tools pretty regularly.

There is, in my humble opinion at least, also some bad news, not least of which is that Microsoft seem to have turned the Windows Phone platform into, essentially, another iPhone! While developers can use free development tools (or full versions of Visual Studio) to create their applications (just like with the iPhone), they will have to sign up for a $99 annual “Windows Phone Developer” subscription in order to have the ability to deploy their application to their physical phone for testing (just like with the iPhone).

It will no longer be possible to deploy applications via “CAB file” installations; in fact, for anything other than developer testing, the ONLY way to get an application onto a Windows Phone 7 device will be via the Microsoft “Windows Phone Marketplace” (just like with the iPhone). When a developer publishes an application to the marketplace they can choose whether the application is free, or is to be charged for. With iPhone development, developers can submit an unlimited number of free applications, and many do. With Windows Phone 7, developers can only submit five free applications, and after that there will be a charge to submit further free applications. If an application is submitted for sale, Microsoft will take a 30% cut of any proceeds (just like with the iPhone).

Applications submitted for inclusion in the marketplace will be subject to “testing and approval” by Microsoft (just like iPhone apps), and apps may be rejected if they don’t meet the guidelines set by Microsoft (just like with iPhone apps). This inevitably means that some types of applications won’t be allowed. For example, with the iPhone it is not possible (in the US at least) to use “tethering” to enable you to plug your iPhone into your laptop in order to access the Internet via the cell phone network, and I would imagine we’re now going to see similar restrictions on Windows Phone 7 applications.

iPhone applications execute in a very strictly defined sandbox, and while this does afford a lot of protection for the platform (because, for example, one application can in no way interact with the data of another application), it can also seriously limit what applications can do. For example, on the iPhone it is not possible to save an email attachment (say a PDF file) and subsequently open that PDF file in another application, Acrobat Reader for example. While I understand the protections offered by the sandbox approach, as a user of the device I feel that it restricts too far what I can do with the device. The Windows Phone 7 platform is essentially exactly the same.

Other restrictions in the Windows Phone 7 platform that developers will have to come to terms with are:

  • No access to TCP/IP sockets
  • No access to Bluetooth communication
  • No access to USB connections to a host computer
  • No Windows Forms UIs
  • No SQL Express access
  • No ability to execute native code via pinvoke (except device drivers, which must be approved by Microsoft)
  • No customization of O/S features (e.g. no alternate phone dialers)

One thing that strikes me as kind of strange is that, apparently, the web browser on Windows Phone 7 will not support Flash, and apparently will not support Silverlight either! The Flash thing is kind of expected, since both Apple and Microsoft seem to do everything they can to keep Flash off THEIR devices, but not supporting Silverlight (on an O/S where the entire UI is Silverlight) was a surprise … at first. Then I realized that if the browser supported Silverlight there would be a way for developers to circumvent all of the application approval and marketplace restrictions that I talked about earlier!

Another surprise was that, like all versions of the iPhone until iOS 4.0, Windows Phone 7 devices will only execute a single user application at a time. This is one of the main things that iPhone users have complained about through the versions, and Apple just learned the lesson, but it seems that Microsoft have decided not to. For developers this means that it is imperative that applications save their data and state frequently, because the application could be terminated (with notification and the ability to clean up of course) at any time.

One thing is for sure … Microsoft seem to be betting the company on “The Cloud”, and Windows Phone 7 falls straight into this larger scale objective. The vision is that this new device will be a gateway to The Cloud in the palm of your hand. It is expected that many applications may execute directly from The Cloud (rather than being installed locally on the device) and that the device will have the ability to store (and synchronize) data in The Cloud. Apparently these features will be included for free, with a limited (not announced) amount of on-line storage, and presumably fee-based options for increasing the amount of storage available. Of course using things in The Cloud is all well and good, until you find yourself in a "roaming" situation, paying $20/MB, or more!

On the bright side, Windows Phone 7 devices will be available from a number of different manufacturers, so there will be choice and competition in the marketplace. Windows Phone 7 devices will (in the US at least) be available from a number of cell phone carriers, unlike Apple’s exclusive deal with AT&T.

While there is no doubt that Windows Phone 7 promises to be a seriously cool new device, and I have no doubt will sell in larger numbers than any of the predecessor Windows Mobile devices ever did, it remains to be seen whether it will have what it takes to be a serious competitor to the mighty iPhone. I can’t help wishing that Microsoft had done at least some things a little bit differently.


A Sticky Note from TechEd 2010

By synergexadmin, Posted on June 9, 2010 at 12:00 am

So, I’m here at TechEd 2010 in the hot, muggy, all-around sticky town of New Orleans. I’m pretty sure that the person who decided that holding a summer conference in the bayou was a good idea is not here, as I’ve yet to hear of any lynchings.

Fortunately, the conference center is nice and cool (I’m sure the air conditioning bill is staggering), and the fact that I’m surrounded by thousands of techies – mostly of the male variety – is somehow less onerous when combined with the cool, climate-controlled breeze swirling about me.

TechEd is most certainly a Microsoft conference, and it can be difficult to find the right sessions to attend. Sure, we want to keep up on the latest and greatest uses of Microsoft technology, but only as they relate to the needs of Synergex’s customers. Learning all there is to know about SQL Azure, or figuring out how to take advantage of SharePoint SuperDuper Edition, just isn’t going to help many of us.

However, there’s been at least one session during every schedule slot which highlights some product, feature or design pattern that can assist Synergex customers who employ Microsoft technologies. Surprisingly, there have even been a few presentations that contained nuggets of good material that can be extended to some of our OpenVMS and Linux/Unix customers as well.

I’ll be following up in the days and weeks to come with some “Tech Tips” that will hopefully save some of you a headache or two. From diagnosing network problems that affect the performance of xf-enabled solutions (you’ve just gotta love what XP does to networks), to using Visual Studio 2010 to quickly set up a working CRUD application (which pretty much looks like it sounds, but at least it works!), to Silverlight desktop deployments (anyone for providing a Mac solution?), I’ll be trying to share some of the knowledge with those fortunate enough not to be trapped in the sauna known as the Big Easy.

Until next time!


The Cloud

By Richard Morris, Posted on May 19, 2010 at 6:00 pm

Star date 16 May, Consultant’s log 1. At three pm this afternoon I was arriving at Manchester airport expecting to catch a 40 minute flight over to Belfast for a relaxing evening before visiting customers on Monday morning. Six hours later I’m sat on the deck of a Norfolkline ferry with a seven hour crossing ahead of me. Don’t get me wrong, I have nothing against ferries, but this one is predominantly filled with drunken football fans, screaming kids, and high octane, beer swilling, diesel fragranced truck drivers, and there is not a spare chair to be had. Camping out on the floor is my only option. First class travel!

We’re half way through our road trip of the UK – we being Bill Mooney and myself. The first half of the trip started off just peachy. Some great customer visits and some lovely British scenery to drive through – and drive we have. We have covered hundreds of miles in the last three days, but it’s a small cloud of dust that threatened to halt our travels. Did this daunting fact stop Bill in his tracks? I think not!

The volcano in Iceland has again decided to spew out more dust into our ever so fragile atmosphere. The resulting cloud has grounded flights around the UK yet again. So, sat in a local bar, every bit of modern computing and communications equipment to hand, we set about finding a resolution to our predicament. No cloud was going to stop us from visiting our customers!

Several calls, web searches and dead-ends later we hatched the plan. The trouble was, so did everyone else. We were not alone in our quest to reach the Emerald Isle. With all planes grounded, and neither of us having remembered our swimming trunks, our only option seemed to be to sail. Our options were limited – we needed to reach a port within the next couple of hours to have any chance of securing a berth. And so began our “great adventure”.

The drive over to Liverpool was rather eventless. On arrival at the wrong port (freight only) we were guided to the passenger terminal, some ten miles away. On arrival Bill made a dash for the check-in desk. Knowing Bill as I do, I parked up and hot-footed it over just in time. “Ya, that’ll do fine” I heard Bill say as he was handing over his credit card. “What will?” I inquired. “It’s OK, they have room for us.” On further interrogation it transpired that “room” actually meant “any space you can find on the floor”. There were no cabins left, and the boat was completely full. “Full” actually means “more people than the facilities on the boat can handle”. But the customer always comes first, I thought to myself. Tickets booked, car loaded onto the ferry, and here I am, sat on the floor wondering what sleep I may actually get, knowing I have a two plus hour drive to Belfast ahead of me – at five the following morning. And the prospect of a shower a distant dream.

Then I started to notice a smartly dressed gentleman winking at me. Not sure of his intentions, and cornered (did I forget to mention Bill had left me to watch over our bags while he assembled his tripod and camera and set off around the ship to “capture the moment”?), I was concerned, to say the least. Should I try to carry all the bags and run? “Hey son”, shouted the winking man. “We’ve got you a cabin!” It turns out the very nice gentleman had a slight affliction in his right eye, but how was I to know that? The news was greeted with much joy. “And one for Bill?” I inquired. “You’ll be lucky, you’re sharing!” Now I’ve known Bill for many years, but sharing a small cabin on a rocking ferry has never been very high on my bucket list. But, needs must, and the customer always comes first, I thought.

Very little sleep (due to a snoring Bill) was rudely interrupted by the crew banging on the cabin door at four am. “Breakfast is served”. Still, we had a cabin, and it had a shower! OK, let’s clarify – a gently dripping faucet and a shower curtain that gets blown about and actually wraps around you as you attempt to wash. Not the best start to an early morning. Breakfast was nice, or at least it looked nice. I’d no sooner sat down with my full English when the PA announced “would all car passengers please return to their cars immediately”.

We were soon off the boat, and heading up the motorway towards Belfast. Both customer visits were a great success, and to be honest well worth the effort. We were booked on the return ferry from Dublin to Liverpool on Monday evening. Did we make it……


Challenges Facing Synergy Developers

By synergexadmin, Posted at 4:09 pm

I was recently involved in a discussion concerning Synergy tools and technologies, and how our customers can best position themselves to take advantage of them. Somewhere along the way, I was asked for (or I volunteered – I don’t exactly remember) my opinion about the current software development landscape, and the challenges that face so many of our customers today.

I identified three areas of concern for our customer base and the future of their applications. And since you’re reading this, I assume that you are a Synergy customer, so hopefully what I had to say will strike a chord with you – either as a challenge you’ve already faced and overcome, or as an issue with which you’re currently grappling.

GUI is King

Yes, you’ve heard it a million times before, but unfortunately you’re going to keep hearing about it until you’ve updated the look and feel of your application. The continued survival of almost every solution boils down to implementing one of two main GUI choices: web technologies or Windows .NET (which means WPF, Silverlight or WinForms).

If you were reading that last part carefully, you’ll note that this discussion is not targeted solely at the *nix and OpenVMS operations; if you’re on Windows already, but are still using the same UI Toolkit displays that your app was using years ago (i.e., you’re not incorporating .NET technologies with the Synergy/DE .NET API), then you’re in the same boat as everyone else.

Arguments about the efficiency of character-based data entry are still valid, but they’re becoming less and less relevant (or realistic) – particularly if you’re using the speed of data entry as an excuse to allow the rest of your application to wallow. While you may have several “workhorse” data entry screens, chances are they make up a very small percentage of your overall application. Information displays, reports, “look-ups” or maintenance utilities have no reason to be tied down to a cell-based presentation.

GUIs are simply too pervasive and too familiar to ignore, and purely cell-based software solutions cannot hope to compete for much longer. Even the most enthusiastic supporters of your application, the ones who can see beyond the green screen or the pseudo-Windows of Toolkit to the power of your application, will readily agree that it’s becoming more and more difficult to convince new hires (or prospective customers) of the superiority of your Synergy solution.

And I’m not just talking to the Synergy shops that develop vertical applications which are then distributed and sold; Synergy “end users,” the shops that have an in-house, custom-built Synergy solution, need to take note as well. Serious scrutiny of your Synergy application is coming with the next management or executive-level turnover.

Perception is Reality

We’ve all heard the term before, but I think it would be better stated as “People act on perception as if it is reality.”

One of the most common (mis)perceptions in today’s society is “newer is better.” It’s beaten into us at every turn and in every advertisement, from New and Improved Acme glass cleaner to Next Best Thing flat panel televisions. And unfortunately, a cell-based UI isn’t doing you any favors in this department. Cell-based apps are generally looked at as “old,” “outdated,” “legacy,” or simply (my favorite) “DOS.” It’s this perception that gives rise to a host of other concerns and questions. Interoperability. Reliability. Power.

But perhaps the most dangerous questions are the ones that aren’t asked. If the users of your application generally see it as a throwback to a bygone era, then it’s possible – nay, likely – that it never even occurs to them to ask the right questions. Can it communicate with Brand X’s software? Can it display graphs that give a visual indication of the state of your sales? Does it offer a web service? Does it support ad hoc reporting? These questions are the sales points from competing software vendors, and rest assured that management has heard the pitches – whether it be the CEO at your customer’s company, or the management team of your own.

Remember: Your UI is the gateway to the power of your application. If it looks like something that was distributed on a 5¼" floppy, then it’s time to seriously rethink the front end of your application.

Now is the Time to Act

The current recession has made life difficult for almost everyone. Sales are down, money is tight, and thoughts of corporate expansion have been put on hold.

For the past two years or so, companies simply haven’t had the money to invest in a brand new software solution, whether it be an ERP package to replace your in-house software, or a glossy, glitzy, Windows-and-SQL-based solution from your competitor. If you’re wise, you’ll look at this period as a reprieve from increasingly heavy competition, and as an opportunity to throw development efforts into high gear.

Use the time to ensure that your software has every last competitive edge when the wheels of the economy start moving again. Take advantage of the lull in sales and expansion, and incorporate the latest and greatest that Synergy has to offer. Put a GUI on your application – even if it needs to be one screen at a time. Use the Synergy.NET API to enhance the look and feel of the most commonly-used screens of your Toolkit application. Make the small investment in Synergy SQL Connect, and set up a data warehouse that can be used by SQL Reporting Services, or open up the data in your system to other commonly-used software by utilizing Synergy xfODBC. Create a web portal, add a web service or two, and take advantage of the APIs and web services of other software solutions to enhance your own.

In Conclusion…

I want it known for the record that I’m a huge fan of Unix/Linux, and absolutely love OpenVMS, so understand that I’m not advocating a switch to the Windows OS. Remember that you can still take advantage of the reliability and stability of both *nix and VMS backend systems, and just bolt on a Windows GUI client wherever appropriate. Heck, chances are that your users are already using Windows boxes running VT emulator software anyway, so there’s probably no hardware investment that need concern you.

And there are plenty of methods available to leverage your core systems and routines if you’re one of the many Synergy shops currently running a “green screen” solution. Investigate xfODBC, SQL Connect, xfServer and xfServerPlus. Install and play around with the (free) Synergy Data Provider for .NET, and get yourself acquainted with Windows- or web-based technologies, programming languages and development environments. Start cleaning up your code and separate that business logic from the UI components – it may not be as hard as you think.

A little research and a small investment in time will go a long way toward illuminating the path of a better, more robust and more marketable software solution.

Do you agree? Disagree? Have a story that’s relevant, or want to share your own challenges and solutions to the GUI problems or perceptions you’ve faced? Let us know about them, and let’s get the discussion started.


Counting on Synergy

By Richard Morris, Posted on May 16, 2010 at 9:13 pm

One of our customers, Ibcos Computers, has used Synergy/DE to develop a web server. It may sound a bit bland, but it’s actually quite cool. In simple terms they have a server program running, written in Synergy, which accepts inbound resource requests. Each request then causes a child process (again in Synergy) to be created to honour the request and return the required data. That data may be a web page, an image, or data as a result of executing some existing Synergy logic from their application.

Well, so far so good. The web server application works a treat. They can host complete web sites, and execute tried and tested logic from their Synergy application accessing their Synergy DBMS data. However, the way the Synergy/DE licensing works means that every time a child process is created it consumes a Synergy license. This is a very valid situation – you are running a Synergy process, so you consume a Synergy license. However, that process may simply be locating and returning a resource associated with a web page, for example an image. When a web browser is loading a page from a web server it will asynchronously request many page resources at the same time. This means that for a single page the web server may have many child processes executing, so the page can load quicker. It can be argued that although there are a number of live processes, each consuming a Synergy license, there is actually only one user, or client, which is the web browser.

Ibcos also have additional execution models for their web server. They have client applications that “log-in” to the site, execute Synergy logic to process data, then “log-out”. For this model, each connection does, and should, consume a Synergy runtime.

The challenge here is to reliably and accurately count the number of users actually accessing the web server. From a licence enforcement aspect, each resource request allocates a Synergy runtime license. However, Ibcos require a license entitlement model that would allow them to monitor individual types of connections, and be able to identify when pools of connections actually relate to an individual user.

For a solution Ibcos looked towards Synergex. I wrote a specification and they commissioned us to write the required code to utilise the Synergy/DE Licensing Toolkit to monitor license usage. Ibcos can use the Synergy/DE Licensing Toolkit to create and register licensed products within the Synergy/DE License Manager. As the different user types access the Web server, license slots are allocated and released, as required, around the logic being processed. A background process has been written that monitors license usage every five seconds. This information is stored and a program to report on license usage levels has been created. Synergex can then utilise this information to assist Ibcos with their license entitlement requirements.

Ibcos Computers, based in Poole, England, provides software solutions, written in Synergy, to the Agricultural, Ground Care, Construction, Material Handling, Commercial Vehicle and Industrial Dealer markets. Ibcos Gold, their main Synergy/DE UI Toolkit based application, has an installed base of over 500 customers and utilises the latest version 9.3 capabilities.

If you would like more details about License Enforcement and Entitlement, and the Synergy Licensing Toolkit, you can comment on this blog, email me at richard.morris@synergex.com, or visit the resource pages at www.synergyde.com.


On The Conference Trail

By Richard Morris, Posted on March 28, 2010 at 9:45 pm

Can you believe it is four months since I posted my first blog from Microsoft TechED in Berlin?  Time certainly does fly by, and so does the changing world of software development.  At Synergex we are committed to ensuring that we keep abreast of advancing technologies, so that we can continue providing expert advice and direction to Synergy/DE developers. 

This week I’m in London, and for the start of the week I can actually say “sunny” London.  Considering the weather we have been having here recently, I can honestly say spring may just be on its way!  I’m in London attending DevWeek 2010, a software developers’ conference sponsored by, but not presented by, Microsoft.  I say that to emphasise that this is not another TechED.  This conference focuses more on the people using the tools and technologies, the real people like you and me, and not just the products on offer.

DevWeek 2010 is a lot smaller, in attendee numbers, than the TechED conference.  There are hundreds of people here, not thousands.  It’s a lot more personal, and you get much more opportunity to “discuss” (read “argue my case”) with the presenters.  The first session I attended on Monday morning had twelve people in the room, so it’s a lot more intimate.  The content, however, is far from small scale, with some impressive solutions to today’s application requirements.

Monday was a pre-conference day and I spent it delving deep into Silverlight, Microsoft’s XAML/WPF based Web development and application deployment environment.  You can achieve many cool things very easily with just a line or two of code.  Did you know that, using Silverlight, it’s possible to deploy your application so that it appears to run natively on an Apple Mac?

One thing this conference has again emphasised to me is the separation between software development and design.  And by design I mean the user interface.  All us programmers believe we can “cut it” when it comes to UI development, and many of us can do a reasonable job.  But put today’s design tools into the hands of the UI designers and it’s amazing what they can produce.  And the ease with which you can then implement their “styles” into your application is astonishing!  My first “hello world” foray into UI design was to produce a form with a round ball like object on it, and of course the customary “Press Me” button.  Pressing the button caused the ball to drop from the top of the window to the bottom, and then bounce a couple of times before coming to a rest.  Using Microsoft’s Expression Blend made it far too easy.  It really is a simple case of draw the object (my ball), define the result (ball at bottom of window) and pick a transition (how it gets from top to bottom).  Add my button, and a couple of clicks later, it really does look like the ball is naturally falling (starts slow and speeds up) and then bounces a little as it comes to its final resting place.  And I have not written a single line of code!

OK, so I can hear you thinking “and the use of a bouncing ball is?”  Good question.  Not much, I agree, but the techniques can easily add a real visual edge to your application.  Consider if you created a button style, which you could apply, as a default, to all buttons within your application.  The style could, for example, make the button double in size, including the text, when you hover over it with the mouse, making it easier to read.  Maybe have a circular ripple effect when you press it.  What if the input control (field) on your form that has focus were automatically magnified, giving the user a clearer view of what’s being asked for, without obscuring other areas of the form?  Your options are endless, and from a software developer’s standpoint – no code changes.  The code to data bind your fields to your Synergy data remains the same, but the look can be infinitely customisable.

Structuring our UI design correctly, data binding our controls in the XAML, and avoiding all non-UI related code-behind ensures that we can utilise these UI development tools to the full.  We’ll be presenting these cool techniques at this year’s SPC in October, so make sure you’ve booked your place!


How Did I Get Here?

By Richard Morris, Posted on March 22, 2010 at 9:41 pm

Well, that was the DevWeek that was.  As I mentioned in my last blog, it’s been my first time here, and it’s been a very informative and interesting conference.  Will I attend next year?  I’d like to think so.  But that decision is not mine to be made.  As many of you reading this know, it’s “Manny” who makes the decisions.

So how does he (or she for that matter – I too had a lady boss once) go about making such a decision?  Is it a case of “If I mention it enough, I’ll get to go?”  I’m not sure that one works, and it’s certainly not going to be too reliable!  I could try the tack of “if I request to go to everything I must get to go to something”, but information overload just takes up too much of your manager’s time.  My boss certainly doesn’t search the web and peruse the small ads looking for events to send me on, far from it!  He’s got much better things to do, like figuring out just how he can squeeze my two new, 23 inch, flat panel, high resolution, wide screen monitors that I’ve just ordered, into this month’s budget.

But I need to ensure I keep up to date with modern technologies and techniques.  Reading web articles, blogs and industry publications helps, and one of the biggest benefits of my job is working with such a diverse user base – you all have different and unique requirements, but can apply very similar solutions using Synergy and all its related capabilities.

So, my boss relies on me to investigate and review all the available options.  I must admit, although Synergex has a single office in California, it certainly has a global attitude to training and employee development – I’ve been to conference and training events in the US, and across Europe.

The factors which influence my decisions as to which conference or training event to attend come directly from the requirements of our user base, and the direction in which Synergy/DE is going.  There is little point in being an expert in a technology that is of little or no use to our users.

I learnt a lot this week, both by attending some really good sessions, and also by talking to presenters and other attendees.  My background is significantly different from many (ok, all!) of the people I spoke to.  They live, breathe and sleep Microsoft and .NET.  I don’t, but I need to understand how best we can encapsulate these new technologies within Synergy/DE applications.

Now, what are your justifications for attending this year’s SPC in October?  Firstly, as soon as the agenda is published, review it and ask yourself “what in the agenda will help me to advance my application?”  I bet you can find at least ten things.  If your application runs on Windows and you plan on sprucing up your UI, there are a number of sessions that will walk you through the process of implementing the latest super slick WPF interface.  And remember, all source code will be made available on Code Exchange.  Not running on Windows?  It’s lucky you have Synergy/DE then!  Many of this year’s sessions focus on non-Windows environments, and the latest OO capabilities within the language.  Beginning to implement simple OO practices within your application will greatly improve the quality of your code, and the speed at which you can develop it.  It’s not difficult to get started using Synergy OO concepts, and they integrate seamlessly with your existing procedural code!

How many of you UNIX and OpenVMS developers have a PC on your desk, and use a telnet terminal emulator to edit and build your code?  You’ve seen Workbench and know just how powerful a development environment it is, so why not swap the emulator for Workbench?  We’re developing a session that will show you just how easy it is to edit and then remote-build your source code from within Workbench.

Arm yourself with the facts.  Talk to “Manny”, and book your ticket to this year’s SPC!

One of the justifications I have for attending these conferences is that I can impart my new-found knowledge and understanding, through the SPC, to our customers – you!


Now that’s what I call service!

By Richard Morris, Posted on February 23, 2010 at 6:11 pm

Web services have been around for quite a while now.  If you attended the Success Partner Conference back in 2006, in London or Boston, you’ll have completed a hands-on tutorial that walked you through the steps required to consume web services with Synergy code.

During the conference, we wrote Synergy code to consume a number of different web services, from a number of different hosting sites.  The first service we utilised returned an array of valid country names.  We then used the returned information to populate a UI Toolkit combo drop-down input control to allow the user to select the required country.  Other services we used allowed us to validate address details and credit card numbers.  The final web service we worked with, which I know several customers have since made use of, gave us the ability to interface with a Short Message Service (SMS) API and send text messages to a mobile phone number.  Not all web services are free, and if I remember correctly, I believe the cost for sending an SMS was about $0.008!

So, when I saw the request from Rodney Latham on the Synergy-l list server, asking if anyone had code to determine if a given date was a public holiday, I thought “there must be a web service to determine that”.  After a quick search on the Internet I found one at a site called www.holidaywebservice.com.  After a few minutes of navigating the site, and looking at the WSDL (Web Service Description Language), it was obvious that I could write a few lines of Synergy code to consume the web service to determine if a date was a public holiday.  Another great feature of the web service was that it was free!

I knew that Rodney’s Synergy software ran on the OpenVMS operating system, so any fancy MS Windows coding was to be avoided – it had to be pure Synergy.  As long as the machine could access the internet we could execute the web service.

I could have written a regular Synergy subroutine or function, but instead I chose to implement the call to the web service within a static method, as part of a SynPSG.DateFunctions class.  By defining the method as static, you don’t need to instantiate an instance of the class.  This makes the coding very simple.  Add a reference in your code, using the “import” statement, to the SynPSG namespace and you’re ready.  To determine if the date is a holiday, simply execute the static method, for example;

import SynPSG

proc
    if (SynPSG.DateFunctions.IsHoliday(20100105,
&       SynPSG.CountryCode.EnglandAndWales))
    begin
        ;;we have a holiday, let's party!
    end

The first argument to the IsHoliday() method accepts a “reversed” eight digit date (year, month, day, as in the example above) and the second is an enumeration which allows you to select the required country code.  The enumeration is defined within the SynPSG namespace;

public enum CountryCode
    Ireland
    EnglandAndWales
    Scotland
    UnitedStates
endenum

Simple, free and the code is available on the Synergex Code Exchange – log into the resource centre and search the Code Exchange directory for IsHoliday.


Using Workbench to Build Applications on Remote Servers

By William Hawkins, Posted on February 18, 2010 at 4:21 pm

I recently went to a customer site, to help them integrate Workbench into their OpenVMS development environment.  As a source code editor, the “integration” is relatively simple: you just need to have NFS/CIFS/SAMBA installed, and use it to make your OpenVMS (or UNIX) drives look like they’re actually Windows drives.  However, when you want to compile or link your application, you either need to go back to your telnet window and build it there, or you can download the RemoteBuild utility from the SynergyDE CodeExchange.  There are two versions for OpenVMS out there right now, both provided by Chris Blundell from United Natural Foods Inc.

Synergex PSG decided that we wanted to provide a remote build facility that could talk to multiple environments using information from a single Workbench project.  We wanted to minimize any potential firewall issues (for companies that have internal firewalls), and we also wanted to build on the SynPSG.System classes (distributed with ChronoTrack – our SPC 2009 demonstration application) for the network communication.  For those of you unfamiliar with the SynPSG.System classes, they are a partial implementation of the Microsoft System classes written in Synergy (so they work on all supported platforms), but when we have Synergy for .NET available, we'll be able to use the native .NET Framework System classes without modifying code (OK, we'll have to change the import statements, but that should be all).

So PSG has posted our flavor of a remote building application into CodeExchange – it's called remoteServer.  There is a client component to install into Workbench, and a server component (written in Synergy) that runs on each development server.  If you have both SAMBA (or equivalent) and remoteServer installed and configured, you are able to compile your application on the remote system, and in the unlikely event of a compile error (I know, you never have any coding errors) you will be able to double-click on the error in the output window and go straight to that line of code in the source file on your remote server.

If you work in a multi-platform development environment, I would encourage you to go download any of the remote build offerings in CodeExchange, and start using the power of Workbench to help improve the productivity of your developers.


To Print or not To Print

By Richard Morris, Posted on February 12, 2010 at 10:10 pm

The Synergy Windows Printing API is a collection of routines that allow you to fully control printing on the Windows platform, and make use of extended printer features.  The API records the print information in a Windows enhanced metafile which can then be played back to devices such as a printer or the print preview window.

Many of you will be using this API to bring true Windows printing capabilities to your applications.  So, when you upgraded to version 9.3 of Synergy and rebuilt your applications, where did your prints disappear to?  You may have been caught out by a new feature added to the API.  In version 9.3 two new integer fields were introduced to the font specifications structure to enable you to change the orientation and escapement of the font.  The orientation allows you to specify the angle between the baseline of a character and the page’s horizontal axis.  The escapement specifies the angle between the baseline of a string of text and the page’s horizontal axis.  For full details see the version 9.3 on-line manuals.  The default value for both of these two new fields is zero, meaning that no rotation of the text will occur.

To set the required font characteristics you use the DWP_FONT sub-function of the WPR_SETDEVICE() function.  The font characteristics are defined within a structure called “font_specs”, included in the DBLDIR:winprint.def header file.  This structure is used to allocate memory to store the font specifications, and provide them to the %WPR_SETDEVICE() function.  You allocate the required memory using the %MEM_PROC() function, for example;

    fontHandle = %mem_proc(DM_ALLOC+DM_STATIC, ^size(font_specs))

You can then set the required font details;

    ^m(font_specs.face_name, fontHandle) = "Courier"

However, if you don’t specify values for all of the fields defined within the “font_specs” structure, the values will be undefined, and for the integer fields, most likely be non-zero.  So, the two new fields will actually contain values, and so orientation and escapement settings will be passed to the %WPR_SETDEVICE() function.  Things will no longer print as you expect!

One way to ensure that the integer data is initialised when you allocate dynamic memory is to use the DM_NULL qualifier.  This ensures that any integer data is initialised correctly.  For example;

    fontHandle = %mem_proc(DM_ALLOC+DM_STATIC+DM_NULL, ^size(font_specs))

However, this does not correctly initialise any non-integer data, and is not future-proof.  If the structure is modified to include new alpha/decimal/implied decimal fields in the future, these fields would then contain incorrect values.  An alternative is to initialise a local copy of the font specifications structure and then assign that to the allocated dynamic memory.  The INIT statement ensures that fields within a record or structure are initialised correctly based on the field type.  Firstly, create a structfield, which is a field defined as a structure type.

.ifdef DBLV9
record
    tmpFont    ,font_specs
endrecord
.endc

The code to allocate the memory for the font specification remains the same;

    fontHandle = %mem_proc(DM_ALLOC+DM_STATIC+DM_NULL, ^size(font_specs))

Now we use the structfield to correctly initialise the dynamic memory;

.ifdef DBLV9
    init tmpFont    ;ensures individual fields correctly initialised
    ^m(font_specs, fontHandle) = tmpFont
.endc

This same coding structure can be applied to the other structure specifications defined in the DBLDIR:winprint.def header file.  The code is backward compatible with earlier versions of Synergy, and will prevent any similar issues in the future.

Alternatively, from version 9.1 you can remove the need to allocate memory, and simply use the structfield.  For example;

record
    textFont    ,font_specs
endrecord

And then use the structfield to define font characteristics;

    init textFont    ;ensures individual fields are correctly initialised
    textFont.face_name = "Courier"
    textFont.weight = 700
    wpr_setdevice(rptHandle, DWP_FONT, textFont)
    wpr_print(rptHandle, DWP_WRITEOUT, x, y, "Hello Bloggers!")

This second approach has two advantages.  Firstly, you no longer need to allocate and clean up any dynamic memory.  Secondly, you get full IntelliSense within Workbench, listing the available fields within the font structfield.


What’s in my library?

By William Hawkins, Posted on February 4, 2010 at 4:22 pm

The obvious answer that springs to mind is "books", but some may respond "what sort of library?".  Of course, in this context, I'm really referring to a library containing Synergy object code. 

When referring to Synergy subroutines and functions, on both Windows & Unix, you can perform a "listdbo" or "dblibr -t" on the object file/object library and peruse the output to see the names of your routines.  However, when referring to methods (in classes/namespaces), the name used is mangled.  This mangling process takes the fully qualified name of the method, the return type, and all the parameter types, and reduces it down to a mangled name.  In a lot of cases, you can look at the mangled name and stand a chance of actually recognizing the name of the routine, but decoding the parameters in your head may require the use of illegal drugs.

For example, if you see a mangled name of '7SYNPSG5CORE11UTILITIES3CC6SYNCC11GETCCNAME_O7SYSTEM7STRINGI', you could intuitively see that it's probably this routine: 'SYNPSG.CORE.UTILITIES.CC.SYNCC.GETCCNAME(I)@SYSTEM.STRING'. 

But what about this one: '7SYNPSG5CORE11UTILITIES3CC6SYNCC11GETCCNAME_O7SYSTEM7STRINGSP0P1P2P39CARDTYPE'?  It’s the “same” overloaded routine, but it has a SYNPSG.CORE.UTILITIES.CC.CARDTYPE parameter instead of an integer parameter.  Similarly, if you saw '7SYNPSG5CORE11UTILITIES8WORKING9SHOWFORM_XO7SYSTEM7STRING', you could probably see that it's 'SYNPSG.CORE.UTILITIES.WORKING.SHOWFORM(@SYSTEM.STRING)'.  Now, what if you saw this: '7SYNPSGCR1UTLTESWRKNGSHW9HTGB13'?  Well, it's surprisingly the same SHOWFORM method, but it's been mangled beyond recognition.  The term used here at Synergex is "crushed".  In fact, if you have a crushed name, it's basically impossible to determine the original method name.  If the mangled name of the method is too long for the environment, the mangled name is crushed down to the maximum size permissible.  OpenVMS has a limit of 31 characters, 64-bit systems have a 188 character limit, and 32-bit systems have a 200 character limit.  Actually, the limit is one character less, because we use the rule that if the name is exactly the maximum size, it must be a crushed name.  As the last 7 characters of a crushed name are a checksum, on OpenVMS you’re only left with 24 characters for a human “readable” name.

So how, exactly, does a mangled name become a crushed name?  Well, characters are removed, one by one, until the name is exactly the correct length.  First non-alphanumeric characters are removed, then vowels, then letters from the name of your first born child, then random letters based on the cycle of the moon, until you eventually get a name that fits.  So, with only 31 characters for method names, the OpenVMS users out there will have to become accustomed to seeing crushed (i.e. indecipherable) method names inside Shared Image Libraries.  If you need to create an OpenVMS shared image library, I would recommend creating an object library, and using the make_share.com file to convert it to a shared image library.  However, you may need to review the use of the MATCH qualifier on your shared image libraries, as method names can change with the modification of a parameter (or return) type.  So changing a method to (for example) have an additional optional parameter will cause a new method name to be created.  Unless you rebuild your application to see (and use) the new name, you could find that the application starts giving “routine not found” errors.

You might think that not knowing the name of the routine would be a problem – it's not really, because the compiler has a consistent supply of the same high quality drugs, and given a constant method signature, will always generate the same crushed name.  So it really doesn't care that your code said "object.showform(1)", because it'll know that you really want to call the method '7SYNPSGCR1UTLTESWRKNGSHW9HTGB13' from your library.

For most developers out there, the actual name of a method inside a library is unimportant, but I thought the more curious among you would be interested in this.


Picture This!

By Richard Morris, Posted on January 28, 2010 at 11:29 pm

In days of old, carrying your trusty, heavyweight camera around your neck, you’d take the perfect snap.  You’d then continue snapping away until the film was full, which, for the impatient among us, meant taking a large number of “I was just finishing off the film, darling” type shots. On returning home you’d quickly rewind the film back into its cartridge (unless you had one of those fancy modern “advanced” film cameras of course, where the camera did it for you) and pop it into your “postage free” envelope.  And off you sent it, in the hope that your “once in a lifetime” shot would be processed and returned to you post haste.

And what of the results?  Normally a hazy, slightly out-of-focus batch of glossy pictures that really don’t do your artistic prowess justice.  After all, the subject must have moved, because you’d never intentionally crop off the top of the bride’s head just above the eye line. And why was the gentleman third from the right picking his nose?

If this sounds like your photography experience, the chances are that if you live in the UK the people processing your film were a company called Harrier LLC.  You may know them better as “TRUPRINT”.  At their peak they were processing 85,000 rolls of film per day!  Today, however, they process no more than 1,000 rolls of film.  Not really a great statistic if film processing is your business.  But Harrier saw the potential of the digital world and has embraced the processing of the digital image.  Today they average over 200,000 prints a day, which can rise to over 1,000,000 prints at times like Christmas.  Although this figure is significantly lower than the volumes they handled in the heyday of film, printing is now only a small part of their product portfolio.  The key to success was to diversify.  Today, in this digital age, people want more than just a glossy print of their blurred, half-cropped pictures.  They want the t-shirt, a coffee mug and of course the family calendar, all adorned with their own artistic compositions.  With a few clicks of a mouse you can upload your pictures and have them delivered to your door on anything from coffee mugs and placemats to full-size framed canvases.  You can even have your prized picture delivered to you in lots of tiny pieces – in the form of a jigsaw!

So where does Synergy fit into their IT strategy?  Their OpenVMS-based Synergy/DE applications handle the order processing and management of every item they produce.  Once an order is accepted through one of the many portals, including a host of web sites, major supermarket chains, leading pharmacies, and of course the post, the Synergy application takes control.  It manages the processing of the required prints, storybooks or mugs (to name but a few product lines – they have over 500) through to despatch and successful delivery to the customer.  The Synergex Professional Services Group is assisting Harrier in evaluating the work needed to migrate their Synergy/DE applications from the OpenVMS platform to Microsoft Windows.


Web Browser “Session Merging”

By Steve Ives, Posted on December 8, 2009 at 5:11 pm

I just realized something about modern web browsers, and as a long time web developer it kind of took me by surprise! Maybe it shouldn’t have, maybe I should have figured this out a long time ago, but I didn’t.

What I realized is that Internet Explorer 8 shares cookies between multiple tabs that are open to the same web application. In fact it’s worse than that … it also shares those cookies between tabs in multiple instances of the browser! And as if that’s not bad enough, research shows that Firefox, Google Chrome and Apple’s Safari all do the same thing! Internet Explorer 7 on the other hand shares cookies between multiple tabs, but not between browser windows.

If you’re an ASP[.NET] developer you’ve probably figured out by now why I am so concerned, but if you’re not then I’ll try to explain.

ASP, ASP.NET, and in fact most other server-side web development platforms (including Java’s JSP) have the concept of a “current user session”. This is essentially a “context” inside the web server which represents the current user’s “state”. The easiest way to think about this is to picture a UNIX or OpenVMS system: when a user logs in a new process is created, and (usually) the process goes away when the user logs out. A web application’s user session is not a process as such, but it sometimes helps to think of it that way.

Web developers in these environments can, and very often do, make use of this current user session. They use it to store state: information about the current user, application information about what the user is doing or has done, and possibly even cached application data or resources, to avoid having to repeatedly allocate and free those resources, or to avoid having to fetch a piece of data over and over again when it doesn’t change.

Now, at this point I want to make it clear that I’m not saying this is a good thing to do, or a bad thing to do. Some would say it’s a bad thing to do because it increases server-side resource utilization and hence impacts the scalability of the application; and they would be correct. Others would say that by caching data or resources it is possible to avoid repeated round-trips to a database, or to an xfServerPlus service, and hence helps improve runtime performance; and they would also be correct. As with many things in software development … it’s a trade-off. Better runtime performance and a little easier to code, but at the cost of lower scalability.

In reality, unless a web application needs to routinely deal with large numbers of concurrent users, the scalability issue isn’t that important. As a result, good practice or not, many web developers are able to enjoy the luxury of using current user session state without significantly impacting anything … and they do!

So … what’s the problem?

Well, the problem is that the web (HTTP) is a stateless environment. When a user types a URI into a web browser the browser connects to the web server, requests the resource(s) identified by the URI, and then disconnects. In order to have the concept of a user “session”, the web server needs a way of recognizing that a subsequent request is coming from a browser that has already used the application and is attempting to continue using it. The way that web applications usually do this is to send a “cookie” containing a unique “session ID” to the browser; the nature of HTTP cookies is that, once this happens, the same cookie will be returned to the web server with subsequent requests. Web applications can then detect this cookie, extract the session ID, and re-associate the browser with their existing “current user session”.

This is how most server-side web applications work; it is how they make it possible to have the concept of an ongoing user session, despite the fact that the interaction with the browser is actually just a set of totally unrelated requests.
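
If you’ve only ever had a framework do this for you, here’s a bare-bones sketch of the mechanism, written in TypeScript against Node’s built-in http module purely for illustration (the cookie name and port are made up, and ASP.NET, JSP and friends do all of this internally):

    import * as http from "http";
    import { randomBytes } from "crypto";

    // The server-side "current user sessions", keyed by session ID.
    const sessions = new Map<string, { hits: number }>();

    http.createServer((req, res) => {
        // Look for our session cookie on the incoming request.
        const match = /demoSession=([0-9a-f]+)/.exec(req.headers.cookie ?? "");
        let id = match?.[1];

        if (!id || !sessions.has(id)) {
            // No (valid) cookie yet: create a new session and hand the
            // browser a cookie containing its session ID.
            id = randomBytes(16).toString("hex");
            sessions.set(id, { hits: 0 });
            res.setHeader("Set-Cookie", `demoSession=${id}; HttpOnly; Path=/`);
        }

        // Re-associate this request with the existing session state.
        const session = sessions.get(id)!;
        session.hits++;
        res.end(`You have made ${session.hits} requests in this session.\n`);
    }).listen(8080);

Open that page in two tabs (or two windows) of a modern browser and you’ll see them incrementing the same counter, because they both present the same cookie.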

So now the problem may be becoming clear. Early web browsers would keep these “session cookies” private to a single instance of the browser. Each new browser window would receive and return its own cookies, so if a user opened two browsers and logged in to the same web application twice, the web server would recognize them as two separate “logins”, would allocate two separate “current user sessions”, and what the user did in one browser window would be totally separate from what they did in the other.

But … now that modern browsers share these session ID cookies between multiple tabs in the browser, and more recently between multiple instances of the browser window itself, server-side web applications can no longer rely on the session ID to identify unique instances of the client application!

At least, that is the case if the application relies on HTTP cookies to persist the session ID. In ASP.NET there is a workaround for the problem … but it’s not pretty. It is possible to have the session ID transmitted to and from the browser via the URI. This works around the multiple-instance problem, but it has all kinds of other implications, because now the session ID is part of the URI in the browser, which affects the ability to create bookmarks, among other things.

This whole thing is potentially a huge problem for a lot of server-side web applications. In fact, if your web applications do rely on session state in this way, and have ever encountered “weird random issues”, this could very well explain why!

So what’s the answer for existing applications? Well, I honestly don’t know what to tell you. Passing the session ID around via the URI may be an easy fix for you, but it may not be! If it’s not, then the only solution I can currently offer is … don’t use session state; and if you currently rely on session state, transitioning away from it is probably a very large task!

By the way, today I read that the proposed HTML5 standard includes a solution to this very issue. Apparently it’s called “Session Storage”. Great news! The problem is that, according to some Microsoft engineers who are definitely “in the know”, it is very possible that the HTML5 standard may not be fully ratified until the year 2022! Seriously!
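
For the curious, the proposed API itself is about as simple as it gets; the crucial difference from a cookie is that anything written to window.sessionStorage is scoped to the individual tab that wrote it. A small TypeScript sketch (the key name here is just an example):

    // "Session Storage" in a nutshell: a per-tab key/value store, so two tabs
    // open on the same site each keep their own copy, unlike cookies, which
    // every tab and window of the browser shares.
    function rememberSelectedOrder(orderId: string): void {
        window.sessionStorage.setItem("selectedOrder", orderId);  // this tab only
    }

    function getSelectedOrder(): string | null {
        return window.sessionStorage.getItem("selectedOrder");
    }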

