
Synergex Blog


HTTP API Enhancements in DBL 10.1

By Steve Ives, Posted on January 14, 2013 at 11:24 pm

In addition to introducing several totally new features, DBL 10.1 also includes enhancements to the client portion of the HTTP API. These enhancements make the API significantly easier to use, and they also make it possible to achieve things that were not previously possible.

Since the HTTP API was introduced in DBL 7.5, the client part of the API has consisted of two routines: HTTP_CLIENT_GET and HTTP_CLIENT_POST. As their names suggest, these routines allow you to issue GET and POST requests to an HTTP server. A GET request is a simple request in which a URI is sent to the server and a response (which may include data) comes back. A POST request is slightly different in that, in addition to the URI, data may also be sent to the server in the body of the HTTP request.

When dealing with an HTTP server it isn't always possible to determine in advance the amount of data to be sent to the server, and it's certainly not possible to know how much data will come back from the server for any given request. So in order to implement the HTTP API it was necessary to have a mechanism for dealing with variable-length data of any size, and at that time the only solution was dynamic memory.

Using dynamic memory worked fine: any data to be sent to the HTTP server as part of a POST request was placed into dynamic memory and the memory handle passed to the API, and any data returned from a GET or POST request was placed into dynamic memory by the API and the handle returned to the application. Dealing with variable-length strings via dynamic memory isn't particularly hard, but while only a single line of code is required to perform an HTTP GET or POST, several more lines were typically needed to marshal data into and out of memory handles.
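
To make that overhead concrete, here is a minimal sketch of the pre-10.1 pattern for a POST request. The URL is a placeholder, and the HTTP_CLIENT_POST parameter list shown is illustrative rather than exact; the full signature is in the DBL Language Reference.

    record
        reqText     ,a*         ,"name=value&other=123"    ; request body to send
        reqHandle   ,D_HANDLE   ; dynamic memory holding the request
        rspHandle   ,D_HANDLE   ; dynamic memory holding the response
        rspLength   ,i4         ; length of the returned data
        errTxt      ,a256       ; error text from the API
        status      ,i4
    proc
        ; Marshal the request body into dynamic memory
        reqHandle = %mem_proc(DM_ALLOC, ^size(reqText))
        ^m(reqHandle) = reqText

        ; Illustrative call; consult the Language Reference for the
        ; exact HTTP_CLIENT_POST parameter list
        status = %http_client_post("http://example.com/service", 30,
        &       reqHandle, ^size(reqText), rspHandle, rspLength, errTxt)

        ; Marshal the response back out of dynamic memory, then free it
        if (status .eq. 0 .and. rspLength .gt. 0)
        begin
            ; ...copy ^m(rspHandle) somewhere useful...
            rspHandle = %mem_proc(DM_FREE, rspHandle)
        end
        reqHandle = %mem_proc(DM_FREE, reqHandle)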

The introduction of the System.String class in DBL 9.1 opened up an opportunity to simplify the use of the HTTP API, and that opportunity became a reality in DBL 10.1.

In order to maintain compatibility with existing code, the HTTP_CLIENT_GET and HTTP_CLIENT_POST routines remain unchanged, but they are joined by two new siblings named HTTP_GET and HTTP_POST. These routines essentially perform the same tasks as the originals, but they are easier to use because they work with string objects instead of dynamic memory. And because the string class has a length property, it is no longer necessary to pass separate parameters to indicate the length of the data being sent, or to determine the length of the data received. String objects are also used when passing and receiving HTTP headers.
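
By way of contrast, here is the equivalent GET using the new string-based routine. Again this is a sketch; the exact HTTP_GET parameter order should be checked against the Language Reference, and the URL is a placeholder.

    record
        response    ,string     ; response body, returned as a string object
        errTxt      ,string     ; error text, if any
        status      ,i4
    proc
        ; One call, no memory handles to marshal
        status = %http_get("http://example.com/service", 30, response, errTxt)
        if (status .eq. 0)
        begin
            ; ...use response directly; response.Length gives its size...
        end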

The new HTTP_GET and HTTP_POST routines make the HTTP API easier to use, but there is a second part to this story, so read on.

One of the primary use cases for the HTTP API is implementing code that interacts with Web Services, and in recent years a new flavor of Web Services called REST Services (REST stands for Representational State Transfer) has become popular. With traditional Web Services, all requests were typically sent to the server via either an HTTP GET or POST request, but REST Services typically use two additional HTTP methods: PUT and DELETE.

Many of you will be familiar with the term “CRUD”, which stands for “Create, Read, Update and Delete”. These are, of course, four operations that commonly occur in software applications; the code that we write often creates, reads, updates or deletes something. When designing traditional Web Services we would often indicate the type of operation via a parameter to a method, or perhaps even implement a separate method for each operation. With REST-based web services, however, the type of operation (create, read, update or delete) is indicated by the HTTP method used (POST, GET, PUT or DELETE).

To enable DBL developers to use the HTTP API to interact with REST services, an extension to the HTTP API was required, and DBL 10.1 delivers that enhancement in the form of two more new routines capable of performing HTTP PUT and DELETE requests. As you can probably guess, the names of these two new routines are HTTP_PUT and HTTP_DELETE. And of course, in order to make these new routines easy to use, they also use string parameters where variable-length data is passed or received.
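
Putting it all together, the four routines map naturally onto the four CRUD operations. The resource URLs below are hypothetical, and as before the parameter lists are illustrative sketches rather than exact signatures.

    ; Create a new customer resource
    status = %http_post("http://example.com/api/customers", 30,
    &       newData, response, errTxt)

    ; Read an existing customer
    status = %http_get("http://example.com/api/customers/1001", 30,
    &       response, errTxt)

    ; Update that customer
    status = %http_put("http://example.com/api/customers/1001", 30,
    &       updatedData, response, errTxt)

    ; Delete the customer
    status = %http_delete("http://example.com/api/customers/1001", 30,
    &       response, errTxt)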

You can find much more information about the HTTP API in the DBL Language Reference Manual, which of course you can also find on-line at http://docs.synergyde.com. In fact, if you’re feeling really adventurous you could try Googling something like “Synergy DBL HTTP_PUT”.


Unit Testing with Synergy .NET

By Steve Ives, Posted at 11:02 pm

One of the “sexy” buzz words (or, more accurately, buzz phrases) being bandied around with increasing frequency is “unit testing”. Put simply, unit testing is the ability to implement specific tests of small “units” of an application (often down at the individual method level) and then automate those tests in a predictably repeatable way. The theory goes that if you can automate the testing of all of the individual building blocks of your application, ensuring that each component behaves as expected under various circumstances, testing what happens when you use those components as intended and also when you use them in ways they are not supposed to be used, then you stand a much better chance of the application as a whole behaving as expected.

There are several popular unit testing frameworks available and in common use today, many of which integrate directly with common development tools such as Microsoft Visual Studio. In fact, some versions of Visual Studio have an excellent unit testing framework built in: it's called the Microsoft Unit Test Framework for Managed Code, and it is included in the Visual Studio Premium and Ultimate editions. I am delighted to be able to tell you that in Synergy .NET version 10.1 we have added support for unit testing Synergy applications with that framework.
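
To give you a flavor of what this looks like, here is a minimal sketch of a Synergy .NET test class using the MSTest attributes. The StringUtil class and its Reverse method are hypothetical stand-ins for whatever unit you actually want to test.

    import Microsoft.VisualStudio.TestTools.UnitTesting

    namespace MyApp.Tests

        {TestClass}
        public class StringUtilTests

            {TestMethod}
            public method ReverseReversesCharacters, void
                endparams
            proc
                ; StringUtil.Reverse is a hypothetical method under test
                Assert.AreEqual("cba", StringUtil.Reverse("abc"))
            endmethod

            {TestMethod}
            public method ReverseOfEmptyStringIsEmpty, void
                endparams
            proc
                Assert.AreEqual("", StringUtil.Reverse(""))
            endmethod

        endclass

    endnamespace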

I’ve always been of the opinion that unit testing is a good idea, but it was never something I had actually set out to do. That all changed in December, when I found that I had a few spare days on my hands and decided to give it a try.

As many of you know, I develop the CodeGen tool, which is used by my team as well as by an increasing number of customers. I decided to set about writing some unit tests for various areas of the code generator.

I was surprised by how easy it was to do, and by how quickly I was able to see tangible results from relatively minimal effort; I spent around two days developing around 700 individual unit tests for various parts of the CodeGen environment.

Now bear in mind that when I started this effort I wasn't aware of any bugs. I wasn't naive enough to think that my “baby” was bug free, but I was pretty sure there weren't many bugs in the code; I pretty much thought that everything was “hunky dory”. Boy, was I in for a surprise!

By developing these SIMPLE tests … call this routine, pass these parameters, expect this result type of stuff … I was able to identify (and fix) over 20 bugs! Now, to be fair, most of these bugs were in pretty remote areas of the code, in places that perhaps rarely get executed. After all, there are lots of people using CodeGen every day … but a bug is a bug … the app would have fallen over for someone, somewhere, sometime, eventually. We all have those kinds of bugs … right?

Anyway, suffice it to say that I'm now a unit testing convert. So much so, in fact, that the next time I get to develop a new application, I'm pretty sure the first code I'll write after the specs are agreed will be the unit tests … BEFORE the actual application code!

Unit testing is a pretty big subject, and I’m really just scratching the surface at this point, so I’m not going to go into more detail just yet. So for now I’m just throwing this information out there as a little “teaser” … I’ll be talking more about unit testing with Synergy .NET at the DevPartner conferences a little later in the year, and I’ll certainly write some more in-depth articles on the subject for the BLOG also.


The changing face of data

By Richard Morris, Posted at 3:02 am

With the eagerly anticipated release of Synergy/DE version 10, we now have the ability to track and manage changes to Synergy DBMS data files. This new feature, Change Tracking, gives you the ability to track and monitor records within a file that have been modified, added or deleted.

Using the snapshot facility you can start a new snapshot within a file and begin your application's normal processing. Any file updates are recorded against the current snapshot. There can only ever be one active snapshot within a file, so as soon as you start a new snapshot, all previous ones become a historical record of the changes made to the file. In code you can access any existing snapshot within a file to trace the activity, using the new Select class.
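
Conceptually the flow looks something like the sketch below. Be aware that the snapshot-management steps are deliberately left as comments: the actual change-tracking calls are described in the Synergy DBMS documentation, and nothing here should be read as the real snapshot API. Only the Select loop reflects real syntax, and custRecord is a hypothetical record layout.

    import Synergex.SynergyDE.Select

    record
        chn     ,i4
        cust    ,custRecord     ; hypothetical record layout for the file
    proc
        ; 1. Start a new snapshot on the file (see the Synergy DBMS
        ;    documentation for the actual mechanism)
        ; 2. Run the application code that updates the file
        ; 3. Read back the activity recorded against that snapshot,
        ;    for example by iterating the file with the Select class:
        open(chn=0, i:i, "CUSTOMER.ISM")
        foreach cust in new Select(new From(chn, cust))
        begin
            ; inspect or display each record of interest
        end
        close chn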

A recent development for a customer required code changes to a number of programs. Together, the programs defined a process, and it was imperative that my changes didn't break that process. I wanted to be able to test the changes to the data after running each program, not just after I'd run all the programs in the process. With Change Tracking this was a very easy task: I simply created a new snapshot across all my files before running each program. Using the Symphony Framework and CodeGen I created a simple program that displayed the records within a particular snapshot, so I could easily see what changes each program had made. If a program didn't do what I had expected (I intentionally coded in a bug just to check this!) I could simply roll back the current transaction, fix the program and continue. I no longer had to restore the data and start my testing from the beginning.

You can see just how easy it is to utilise Change Tracking by watching a short video I’ve uploaded to YouTube.  Enjoy!

There are many applications for Change Tracking, and at this year’s DevPartner conference I’ll be exploring this new feature in detail – including having attendees help me demonstrate many of the capabilities!


Don't miss a post!

Enter your email address to subscribe to this blog and receive notifications of new posts by email.

Recent Posts Tag Cloud Archives