
Synergex Blog


CodeGen 5.1.6 Released

By Steve Ives, Posted on November 7, 2016 at 4:31 pm

I am pleased to announce that we have just released a new version of CodeGen with the following enhancements:

  • We modified the way that key loops are processed so that if a repository structure has a mixture of access keys and foreign keys defined, the foreign keys are ignored when processing key loops.
  • We added a new key loop expression <IF FIRST_SEG_NOCASE>.
  • We added four new field loop expressions, <IF AUTO_SEQUENCE>, <IF AUTO_TIMESTAMP>, <IF AUTO_TIMESTAMP_CREATED>, and <IF AUTO_TIMESTAMP_UPDATED>, which can be used to determine whether fields are defined as auto-sequence or auto-timestamp fields.
  • We added two new key loop expressions <IF AUTO_TIMESTAMP_CREATED> and <IF AUTO_TIMESTAMP_UPDATED>.
  • We added two new key segment loop expressions <IF SEG_AUTO_TIMESTAMP_CREATED> and <IF SEG_AUTO_TIMESTAMP_UPDATED>.
  • We changed the behavior of the field loop expansion token <FIELD_TYPE_NAME> when used in conjunction with auto-sequence and auto-time-stamp fields.

This version of CodeGen is built with Synergy/DE 10.3.3a, requires a minimum Synergy runtime version of 10.1.1, and can be downloaded from here.


Wheel, Scroll, Oops.

By Richard Morris, Posted on October 14, 2016 at 6:45 am

If you answer “yes” to the following questions, then please read on: Do you have a Synergy UI Toolkit application? Do you use standard (not ActiveX) list processing with a load method? Do you run your software on Microsoft Windows 10?

Windows 10 offers a new feature that allows you to mouse over a list and use the mouse wheel to scroll it, without the list actually getting focus. It’s a great feature, but if your UI Toolkit application displays a standard list that uses a load method, that mouse-over scroll operation will attempt to “process” the list and cause the list load method to execute. That doesn’t sound too bad, but if you have method data being passed through from the l_select() or l_input() routines, that data will not be passed to your load method, because you are not actually in l_select() or l_input(). Also, because the list has not gained focus, you have potentially not been through your “my list is gaining focus, so set up the load parameters” logic, which again means that when your load method executes it is in an unknown state.

When your load method executes in this “unknown” state and tries to access method data or your uninitialized load data, a segmentation fault may occur. The user uses the Wheel, the list attempts to Scroll, and Oops: your application crashes.

Thankfully, the Synergex team has found the issue and resolved it, and the fix will be included in the upcoming 10.3.3b patch. If you are experiencing this issue today and need a resolution now, you can contact support, who can provide you with a hotfix.


Examining and modifying Synergy data

By Richard Morris, Posted on September 5, 2016 at 11:41 pm

The Symphony Framework provides the ability to expose Synergy data as “Data Objects”. A “Data Object”, or DO for short, is a class that exposes the fields of your repository structure as properties that can be accessed using Get and Set methods. These DOs also provide access to the “raw” Synergy record data through a property called SynergyRecord. The SynergyRecord property is the basic way to put record data into, or get record data out of, your DO. There are a lot of additional properties associated with DOs, like their validity based on repository and custom validation rules, but those are for another blog. Exposing the individual field elements as Get/Set properties allows us to bind to them from code (Synergy or C#/VB.NET) and, in the WPF world, from the UI layer in XAML.
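
As a simple sketch (assuming a code-generated Part_Data data object like the one used in later posts on this page, and a raw record that has just been read from the part file), wrapping a record in a DO and reading it back through its properties looks something like this:

;; partRecord is assumed to have just been read from the part ISAM file,
;; and Part_Data is a code-generated Symphony Data Object
data partItem = new Part_Data()

;; put the raw record data into the data object...
partItem.SynergyRecord = partRecord

;; ...and work with the strongly typed Get/Set properties from then on
Console.WriteLine(partItem.Id + ": " + partItem.Description + ", cost " + %string(partItem.Cost_price))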

The Symphony Harmony framework allows you to select data from a Synergy DBMS file using standard SQL-style syntax. Under the hood it uses the powerful SynergyDE.Select capabilities to access data in a file after parsing the SQL-like query string. The data is returned as a collection of DOs. A simple example could be:

Select * from group

Where “group” is the name of a structure/file relationship in your repository. Symphony Harmony returns all the located records in the form of DOs, which means we can bind to the individual properties exposed by each DO.

Harmony can accept more complicated requests, for example:

SELECT Id ,Description ,Quantity ,Cost_price FROM part WHERE cost_price BETWEEN 10 AND 20

Is a valid query that will return only the DO’s that match the where clause.
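
Using the Symphony Harmony DataSelect API (covered in the Symphony Framework 3.2.6 post further down this page), that same query could be run from DBL code along these lines. This is just a sketch: connector is assumed to be a DBConnector instance, and Part_Data the generated data object for the part structure.

data partItem = new Part_Data()

foreach partItem in DataSelect.RunDataSelect(connector, "SELECT Id, Description, Quantity, Cost_price FROM part WHERE cost_price BETWEEN 10 AND 20", partItem).Result
begin
    Console.WriteLine(partItem.Id + ": " + partItem.Description + ", qty " + %string(partItem.Quantity) + ", cost " + %string(partItem.Cost_price))
end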

To show the capabilities of the Symphony Data Object and Symphony Harmony I’ve put together a simple “Synergy DBMS Manager” program that allows you to interrogate and manage data in a Synergy ISAM data file. You can select data using SQL-like syntax and display the results on the screen. You can also update, insert, and delete records, again using SQL-like syntax. To show the query above using the Synergy DBMS Manager:

[Image SDBMS1: the query above running in the Synergy DBMS Manager]

The fields within the query can be typed in longhand or selected from the field list (retrieved dynamically from the DO). When the query is executed, the results grid is dynamically built to include only those fields selected.

As mentioned, you can also modify the data in the file using an UPDATE command, and you can insert new records using an INSERT command. To complete the capabilities, you can also delete records using the DELETE command, but make sure you use a where clause or the whole file will be cleared.

The Synergy DBMS Manager application can be downloaded from SymphonyFramework.Net. You can also access full documentation on the same page or here. You will need at least a Synergy 10.3.1 runtime environment to run the Synergy DBMS Manager program. You will also need to create and build a library of Synergy Data Objects that represent your repository structures. This library is totally code-generated, and full instructions are included in the documentation.

If you would like to take a look at the Synergy DBMS Manager but don’t have the time to build your library of data objects, upload your repository files (zipped, please) at SymphonyFramework.Net and we’ll build it for you!

Why not take a quick look at the documentation to see how easy it is to use the Synergy DBMS Manager?



CodeGen 5.1.4 Released

By Steve Ives, Posted on July 29, 2016 at 10:42 am

We are pleased to announce that we have just released CodeGen V5.1.4. The main change in this version is an alteration to the way that CodeGen maps Synergy time fields, i.e. TM4 (HHMM) and TM6 (HHMMSS) fields, to corresponding SQL data types via the <FIELD_SQLTYPE> field loop expansion token. Previously these fields would be mapped to DECIMAL(4) and DECIMAL(6) fields, resulting in time fields being exposed as simple integer values in an underlying database. With this change it is now possible to correctly export time data to relational databases.

We also made a small change to the CodeGen installation so that the changes it makes to the system PATH environment variable take effect immediately after the installation completes, meaning that it is no longer necessary to reboot the system after installing CodeGen for the first time.

This version of CodeGen is built with Synergy/DE 10.3.3a and requires a minimum Synergy runtime version of 10.1.1.


Replicating Data to SQL Server – Made Easy

By Steve Ives, Posted on July 28, 2016 at 5:29 pm

For some time now we have published various examples of how to replicate ISAM data to a relational database such as SQL Server in near real time. Until now, however, all of these examples have required that the ISAM files to be replicated be modified by the addition of a new “replication key” field and a corresponding key in the file. Generally this new field and key would be populated with a timestamp value that was unique to each record in the file. While this technique guarantees that every ISAM file can be replicated, it also makes replication hard work, because the record layout and key configuration of each ISAM file needs to be changed.

However, almost all ISAM files already have at least one unique key, and when that is the case one of those existing keys can be used to achieve replication without requiring changes to the original record layouts or files. When this technique is combined with the capabilities of I/O hooks, it is now possible to achieve data replication with only minimal effort, often with no changes to the ISAM files being replicated, and with only minimal modification of the original application code.

I am pleased to announce that I have just published a new example of doing exactly that. You can find the example code on GitHub at https://github.com/SteveIves/SqlReplicationIoHooks. Of course if you are interested in implementing data replication to a relational database but need some assistance in doing so, then we’re here to help; just contact your Synergex account manager for further information.


Symphony Framework Update 3.2.6 – File Management Capabilities

By Richard Morris, Posted on July 22, 2016 at 6:24 am

As I wrote about in my last blog, the Symphony Harmony namespace provides the ability to execute select queries and stored-procedure-style methods using SQL-style syntax. The queries and procedures can be executed in-process, or on a remote server via Symphony Bridge using industry-standard HTTP/HTTPS/TCP protocols or a Service Bus Relay. In the 3.2.6 release we have added the ability to fully manage the data in your SDBMS files via this SQL-style syntax.

Accessing or selecting data through the Symphony Harmony namespace is simple. First you need to define your “connection” to the database. To do this you create an instance of the DBConnector class. For example:

data connector = new DBConnector("SymLocal:user/password!SimpleHarmonyTestLibrary.TableMapper.MapTableToFile")

The connection string defines how you want to access the data: SymLocal indicates that it’s in-process, while SymRemote indicates that access is via Symphony Bridge. If you are using Symphony Bridge you must include the “server” you wish to connect to – I’ll include an example shortly. The username and password allow you to provide authentication, but that’s for another blog. The “SimpleHarmonyTestLibrary.TableMapper.MapTableToFile” entry is the named method that Symphony Harmony uses to translate the table name to a physical SDBMS filename, and it’s something you can code-generate.

Here is an example of a remote access connection:

data connector = new DBConnector("SymRemote:user/password@localhost:8081!SimpleHarmonyTestLibrary.TableMapper.MapTableToFile")

Notice the server name and the port the Symphony Bridge server is listening on. Other than the connection string, the usage of Symphony Harmony is identical regardless of whether it’s local or remote data access. When we talk about local and remote data access here, we are referring to where the code that performs the query is executed: either locally in-process, or via Symphony Bridge. We are not referring to the physical location of the SDBMS data files, which may be local or remote and accessed via xfServer. Both local (in-process) and remote connections via Symphony Bridge can access data locally or via xfServer.

To select records from a file you use the SQL-style SELECT syntax and call the DataSelect.RunDataSelect method. You need to instantiate the Symphony Data Object that will define the response data structure. For example:

data partItem = new Part_Data()

foreach partItem in DataSelect.RunDataSelect(connector, "SELECT * FROM part", partItem).Result
begin
    Console.WriteLine("Created ID = " + partItem.Id + ", " + partItem.Description + " Qty = " + %string(partItem.Quantity) + " Cost = " + %string(partItem.Cost_price))
end

Note here that the case of the command is not important; however, the case of string values within a where clause is. You can limit the fields returned, filter the data, and order the results as you require. All of these are valid:

DataSelect.RunDataSelect(connector, "SELECT id, description, quantity FROM part WHERE quantity > 10", partItem).Result

DataSelect.RunDataSelect(connector, "SELECT description, id FROM part WHERE description like 'BRAKE'", partItem).Result

DataSelect.RunDataSelect(connector, "SELECT cost_price, id, description FROM part WHERE cost_price < 1.99 ORDER BY cost_price", partItem).Result

The order of the field list is not important. You also don’t have to hard-wire the filter values, as you can pass positional arguments. For example:

data qtyValue   ,int        ,10

DataSelect.RunDataSelect(connector, "SELECT id, quantity, description FROM part WHERE quantity > :1", partItem, qtyValue).Result

data descValue  ,string     ,"BRAKE"

DataSelect.RunDataSelect(connector, "SELECT id, description FROM part WHERE description like :1", partItem, descValue).Result

data qtyValue   ,int        ,10
data costValue  ,decimal    ,1.99

DataSelect.RunDataSelect(connector, "SELECT id, description, cost_price FROM part WHERE cost_price < :1 AND quantity > :2 ORDER BY cost_price", partItem, costValue, qtyValue).Result

All of the above will return a collection of data objects of type Part_Data().

As I mentioned above, you can now maintain the data in your SDBMS files through Symphony Harmony. We have added the ability to insert data into an SDBMS file: the DataInsert.RunDataInsert() method accepts an INSERT command and a data object, and stores the passed-in data object’s data in the appropriate file. For example:

data partItem = new Part_Data()

partItem.Id = "ID1"
partItem.Description = "this is the description for part 1"
partItem.Technical_info = "lots of additional information for part 1"
partItem.Quantity = 5
partItem.Cost_price = 12.34

DataInsert.RunDataInsert(connector, "INSERT INTO part", partItem).Wait()

The data in the partItem data object is extracted, placed into a Synergy record, and stored in the SDBMS file. Any errors, such as a duplicate key, will be thrown on the client.
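
Such errors can be caught with a standard try/catch around the call. This is just a sketch: the exact exception type thrown by Harmony isn’t shown here, so a general catch is used.

try
begin
    DataInsert.RunDataInsert(connector, "INSERT INTO part", partItem).Wait()
end
catch (ex, @Exception)
begin
    ;; e.g. a duplicate key error surfaced from the insert
    Console.WriteLine("Insert failed: " + ex.Message)
end
endtry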

We have also added the ability to update the data in an SDBMS file using Symphony Harmony. The DataUpdate.RunDataUpdate() method accepts an UPDATE command, which should include a WHERE clause, and a data object, and updates the file with the passed data object’s data. You can optionally pass a list of fields that restrict the data to be updated. So, for example, if your application wants to simply update a field for a given record:

data numUpdated     ,long

partItem.Cost_price = 999.99

numUpdated = DataUpdate.RunDataUpdate(connector, "UPDATE part SET cost_price where QUANTITY = 13", partItem).Result

Will update ALL records that have a quantity value of 13 with the updated cost_price value of 999.99. The resulting numUpdated field will contain the number of records that were updated in the file.

The field list is optional, but it’s recommended! If you don’t specify the field list then the whole record is updated, so you must ensure that the data object contains all the required information; otherwise you may get exceptions if key values that are defined as non-modifiable are changed. You may also inadvertently overwrite changes made by other processes.
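
For instance, an update restricted to just the cost and quantity fields might look like the following sketch. (The example above shows a single field in the SET list; treat the comma-separated list here as an assumption.)

partItem.Cost_price = 2.50
partItem.Quantity = 100

;; only cost_price and quantity are written back to the matching records
;; (assumes the SET list can name more than one field)
numUpdated = DataUpdate.RunDataUpdate(connector, "UPDATE part SET cost_price, quantity WHERE id = 'ID1'", partItem).Result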

And to complete the SDBMS data management capabilities, we have added the ability to delete data from an SDBMS file using Symphony Harmony. The DataDelete.RunDataDelete() method accepts a DELETE command, which should include a WHERE clause, and a data object, and deletes the matching records from the file.

data numDeleted     ,long

numDeleted = DataDelete.RunDataDelete(connector, "DELETE FROM part WHERE id = :1", partItem, "ID1").Result

Removes the required records from the file. There is no need to first select or lock them. The numDeleted field will contain the number of records that were deleted from the file. The where clause is optional, so the following is valid:

numDeleted = DataDelete.RunDataDelete(connector, "DELETE FROM part", partItem).Result

And will result in clearing the entire file – you have been warned!!

All this capability is now available in version 3.2.6 of the Symphony Framework, Symphony Harmony, and Symphony Harmony Unplugged packages on NuGet.


Merging Data and Forms with the PSG PDF API

By Steve Ives, Posted on July 6, 2016 at 9:53 pm

When I introduced the PSG PDF API during the recent DevPartner Conference in Washington DC I received several questions about whether it was possible to define the layout of something like a standard form using one PDF file, and then simply merge in data in order to create another PDF file. I also received some suggestions about how this might be done, and I am pleased to report that one of those suggestions panned out into a workable solution, at least on the Windows platform.

The solution involves the use of a third-party product named PDFtk Pro. The bad news is that this one isn’t open source and neither is it free. But the good news is it only costs US$ 3.99, which I figured wouldn’t be a problem if you need the functionality that it provides.

Once you have PDFtk Pro installed and in your PATH, you can call the new SetBackgroundFile method on your PdfFile object, specifying the name of the existing PDF file to use as the page background for the pages in the PDF file that you are currently creating. All that actually happens is that when you subsequently save your PDF file, by calling one of the Print, Preview or Save methods, the code executes a PDFtk Pro command that merges your PDF file with the background file that you specified earlier. Here’s an example of what the code looks like:

;;Create an instance of the PdfFile class
pdf = new PdfFile()

;;Name the other PDF file that defines page background content
if (!pdf.SetBackgroundFile("FORMS:DeliveryTicketForm.pdf",errorMessage))
    throw new Exception(errorMessage)

;;Other code to define the content of the PDF file

;;Show the results
pdf.Preview()

There are several possible benefits to this approach, not least of which is the potential for a significant reduction in processing overhead when creating complex forms. Another tangible benefit is the ability to create background forms and other documents using any Windows application that can produce print output, Microsoft Word or Excel for example. Remember that Windows 10 includes a “Print to PDF” option, so any Windows application that can print can be used to create PDF background documents.

I have re-worked the existing Delivery Ticket example that is distributed with the PDF API so that it first creates a “form” in one PDF file, then creates a second PDF file containing an actual delivery ticket with data, using the form created earlier as a page background.
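
In outline, that two-pass approach looks something like the sketch below. The Save call shown here assumes a method that accepts a file specification, which may differ from the actual API.

;;First pass: create the static delivery ticket form
data formPdf = new PdfFile()
;;...code that draws the fixed form layout...
formPdf.Save("FORMS:DeliveryTicketForm.pdf")    ;;assumed signature: save to a named file

;;Second pass: create the ticket itself, using the form as the page background
data ticketPdf = new PdfFile()
if (!ticketPdf.SetBackgroundFile("FORMS:DeliveryTicketForm.pdf", errorMessage))
    throw new Exception(errorMessage)
;;...code that adds the delivery ticket data...
ticketPdf.Preview()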

I have just checked the code changes into the GitHub repository so this new feature is available for use right away, and I am looking forward to receiving any feedback that you may have. I will of course continue to research possible ways of doing this on the other platforms (Unix, Linux and OpenVMS) but for now at least we have a solution for the Windows platform that most of us are using.


Symphony takes a REST

By Richard Morris, Posted on at 3:08 am

The Symphony Harmony namespace allows access to data and logic through an SQL-like syntax. For example, you can select records from a file using a query such as “SELECT ID, DESCRIPTION FROM PART WHERE QUANTITY > 12 ORDER BY QUANTITY”. All matching records are returned from the query in the form of Symphony Data Objects. The data can be local or accessed via Synergy xfServer. The Symphony Bridge utility allows you to expose your queryable database via a standard Windows Communication Foundation (WCF) web service. So far so good.

Steve Ives and I recently had the opportunity to spend a week working together in the UK to “bang” heads together. Steve has always been an exponent of providing RESTful services to access logic and data, which can be consumed by just about anything. So we set about using CodeGen to build a standard RESTful service that utilizes the Symphony Framework to enable dynamic access to data and, ultimately, logic.

We soon had the basic service up and running. Our first implementation handled the standard GET verb and returned all the records in the file. No filtering, no selection, just all the records returned as a JSON collection. This is the standard API:

[Image rest_1: the standard GET API]

Now remember that Symphony Harmony allows you to filter the data you are requesting, so we next implemented the ability to provide the “where” clause to the query. So, for example:

[Image rest_2: the API with a where clause]

And using ARC (Advanced REST Client, a Google Chrome plug-in) we can test and query the service:

[Image rest_3: querying the service from ARC]

And we get back just the selected customer details – all those customers where CUSTST has a value of CA.

As well as being able to filter the data, we can also limit the results returned by Harmony to just the fields we need; this has the benefit of reducing the data being brought across the wire. But how can our REST server build the required data objects to include just the fields we select? By doing runtime code generation! Within our code-generated data objects we added the ability to dynamically build the response data object to include only those fields requested by the client. The calling syntax, as provided by the API, is:

[Image rest_4: the calling syntax for selecting fields]

And again using ARC to test our server, we can issue a command like:

[Image rest_5: the field-selection request in ARC]

This is requesting all records from CUSMAS where the CUSNM2 field contains the word “LAWN”, limiting the response data object to just three fields. The response JSON looks like:

[Image rest_6: the JSON response]

Two perfectly formed data objects, limited to the fields in the selection list. If your Symphony Harmony connection to your data is via xfServer, then only those selected fields will have been loaded from the file, again improving performance.

We also added the ability to limit the amount of data returned by adding a “maxrows” option:

[Image rest_7: a request using the maxrows option]

We have already added the ability to perform inserts, updates, and deletes through the Symphony Harmony namespace using standard SQL syntax, and we’ll be adding these capabilities to the appropriate REST verbs: POST, PUT, and DELETE. Watch this blog feed for more information.


CodeGen 5.1.3 Released

By Steve Ives, Posted on June 30, 2016 at 1:11 pm

Tomorrow morning I’m heading back home to California having spent the last two weeks in the United Kingdom. The second week was totally chill time; I spent time with family and caught up with some old friends. But the first week was all about work; I spent a few days working with Richard Morris (it’s been WAY too long since that happened) and I can tell you that we worked on some pretty cool stuff. I’m not going to tell you what that is right now, but it’s something that many of you may be able to leverage in the not too distant future, and you’ll be able to read all about it in the coming weeks. For now I wanted to let you know that we found we needed to add some new features to CodeGen to achieve what we were trying to do, so I am happy to announce that CodeGen 5.1.3 is now available for download.


PSG PDF API Moves to GitHub

By Steve Ives, Posted on May 28, 2016 at 11:05 am

This is just a brief update on the current status of the PDF API that I have mentioned previously on this forum. During the recent DevPartner Conference in Washington DC I received some really great feedback from several developers already using the API, and some very positive reactions from several others who hope to start working with it in the near future.

During my conference presentation about the API I mentioned that I was considering making the code a little easier to access by moving it out of the Code Exchange and on to GitHub. Well it turns out that was a popular idea too, so I am pleased to announce that I have done just that; you can now obtain the code from its new home at https://github.com/Synergex/SynPSG_PDF. And if any of you DBL developers out there want to get involved in improving and extending the API, we will be happy to consider any pull requests that you send to us.


Drilling for Data

By Richard Morris, Posted on April 20, 2016 at 5:44 am

One cool aspect of a Synergy UI Toolkit program has always been list processing. The introduction of the list load method, which populates the list with items as they are required, was a huge step towards decoupling the program logic from the user interface. Because of this separation of concerns it’s actually very easy, using the Symphony Framework, to load a modern WPF data grid from an existing UI Toolkit list load method.

Our starting point has to be the UI, and that’s being rewritten in XAML. XAML allows data binding between class properties and the column elements of a data grid. We expose our Synergy data through Symphony Data Objects. These data object classes represent your repository-based data structures. The fields within your structures are exposed as the properties that the data grid will data-bind to.

Once we have the data grid defined, we need to define the hooks between the collection of data we are going to expose to the data grid and the host program’s list load method. First we call a generic method that allows us to signal the required list loading requirements back to the host program. This snippet calls that generic method, which will then raise the event back to the host:

[Code image d1: calling the generic load-request method]

We are passing the load method name, and then various parameters that define the actual data we are going to load and any additional list load method data. Now we can raise the load method request to the host DBL program:

[Code image d2: raising the load method request to the host]

In the host DBL program we need to handle the event, so we register an event handler:

[Code image d3: registering the event handler in the host DBL program]
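
In general shape, registering for such an event from DBL uses ADDHANDLER. The object, event, and handler names below are purely illustrative, not the actual Symphony Framework API:

;; hypothetical object, event, and handler names, for illustration only
addhandler(symphonyList.LoadMethodRequested, handle_load_method)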

The following code snippet dispatches the load request to the existing UI Toolkit load method. There are two formats for the request, depending on whether the list has associated list method data:

[Code image d4: dispatching the load request to the existing UI Toolkit load method]

So now we need to change the load method. If you have coded your list load method according to the standards laid out in the UI Toolkit manual there should only be a single line of code to change:

[Code image d5: the single line changed in the list load method]

The DoingTK test is a flag we have set to indicate whether the program is running with the traditional UI Toolkit or our new WPF UI.

We shall be drilling down into the ability to handle list loading during the DevPartner 2016 pre-conference workshop as we all migrate an existing UI Toolkit program to a modern WPF user interface.


Synergex Celebrates 40th Anniversary

By William Mooney, Posted on April 15, 2016 at 10:45 am

We are excited and grateful to celebrate Synergex’s 40th year in business.

We started by delivering applications to local businesses in the Sacramento area 40 years ago today. Shortly after opening our doors, founders Ken Lidster and Mike Morrissey got frustrated with having to rewrite perfectly good applications to take advantage of the latest DEC (Digital Equipment Corporation) hardware and/or operating systems each time DEC came out with a new kit. This inspired them to create DBL (Data Business Language), which could take any version of DEC’s DIBOL language and simply recompile and relink to run on any DEC platform. This dramatically changed our—and ultimately our customers’—direction and put us on the map, as they say.

DBL was the first portable development language within DEC. In the ‘80s, Synergex (then named DISC) also became the first company to make it possible to migrate applications from a proprietary environment to the then emerging Unix/Xenix and MS-DOS systems. In the ‘90s, we enabled migration to Windows; at the turn of the century, to the web, APIs, and RDBMSs (SQL Server, Oracle, ODBC access, etc.); and in the teens, to .NET. Today we provide a migration path to mobile devices.

Forty years ago, very few of us, if any, could have imagined how today’s computing environment would look. Fewer still could have imagined that the applications they were developing would be able to run on all of these emerging platforms—but they can! Last week, I saw a DBL source code file dated 1976 run in .NET and on an Apple iPhone. Again, who could have imagined?!

Since our early shift to providing software development tools to application developers, our commitment to “portability” has never wavered. We have remained steadfast in our mission to deliver software tools and services to help our customers take advantage of current and relevant computing. Our products have been the ultimate “future proofing” to the significant investment customers have made in enterprise solutions.

And while many companies that were well known back in the day are no longer on the scene, we continue to grow and thrive more than ever—thanks in large part to our customers. Many of our ISV customers are the leaders in their respective vertical markets and continue to maintain their top positions and expand their install base. Some of our direct end users are household names with north of $100 billion in annual sales, who leverage  our tools to help them prosper. Without their support and partnership, I wouldn’t be writing this. So here’s a big shout out to all of our customers, with the biggest thanks there is.

Lastly, a company doesn’t grow and thrive without the dedication and hard work of its employees—both past and current. We have been—and continue to be—blessed to have very talented and driven people contribute to our success these past 40 years. Several, like myself, have been here for the lion’s share of our existence. But we also have many new employees who represent the future—and the next 40+ years of Synergex. Just as 40 years ago we couldn’t imagine what our industry would look like today, it’s impossible to envision what it will be like 40 years from now. Regardless, I feel very confident that Synergex will be here for our customers and their future generations, supporting whatever future computing environments come along.


STOP! Validation Alert

By Richard Morris, Posted on April 12, 2016 at 12:00 pm

It’s the tried and trusted way to get the user’s attention. At the slightest hint of an issue with their data entry you put up a big dialog box complete with warning icons and meaningful information.

[Image c1: a typical validation message box]

We’ve all done it, and many programs we write today may still do it. When writing code using the Synergy UI Toolkit, it’s common practice to write a change method to perform field-level validation. When the user has changed the data and left the field – either by tabbing to the next or clicking the “save my soul” button – the change method executes and validates the entry. It is here where we stop the user in their tracks. How dare they give us invalid data – don’t they know what they should have entered? It’s an ever so regimented approach. The user must acknowledge their mistake by politely pressing the “OK” button before we allow them to continue. Users are usually not OK with this interruption to their daily schedule, so there must be a nicer way to say “hey there, this data is not quite as I need it, fancy taking a look before we try to commit it to the database and get firm with you and tell you how bad you are doing at data entry?”

When migrating to a new Windows Presentation Foundation UI we can do things a little differently and guide the user through the process of entering the correct data we need to complete a form or window. We will still use the same change method validation logic, however, as there is no reason to change what we know works.

When using the Symphony Framework you create Symphony Data Objects – these classes represent your repository-based data structures. The fields within your structures are exposed as properties that the UI will data-bind to. These data objects are a little bit cleverer than just a collection of properties. Based on the attributes in the repository, a data object knows which fields have change methods associated with them. Because of this, the data object can raise an event that we can listen for – an event that says “this field needs validation by means of this named change method”. Here is a snippet of code registering the event handler:

[Code image c2: registering the validation event handler]

The event handler simply raises the required “change method” event back to the host DBL program:

[Code image c3: raising the change method event back to the host]

Back in the DBL program we can now listen for the change method events. Here is the event handler being registered:

[Code image c4: registering the change method event handler in the DBL program]

Remember, we are now back in the host DBL code, so we can dispatch to the actual change methods registered against the field. This is a code snippet and not the complete event handler code:

[Code image c5: dispatching to the registered change methods]

We are calling into the original change method and passing through the required structure data and method data. Inside the change method we will have code that validates the entry, and then, as this snippet shows, we can perform the error reporting:

[Code image c6: reporting the validation error]

If the code is running as a UI Toolkit program, the normal message box dialog is used to display the message. However, when running with the new WPF UI the code provides the required error information against the field. No message boxes are displayed. The user will see:

[Image c7: the field highlighted in error, with a tooltip describing the problem]

The edit control background is coloured to indicate an issue with the data, and the tooltip gives full details of the problem. When the user has entered valid data, the field reverts to the standard renditions:

[Image c8: the field with standard renditions after valid entry]

We shall be exploring the ability to handle field change method processing during the DevPartner 2016 pre-conference workshop as we all migrate an existing UI Toolkit program to a modern WPF user interface.



What’s on the menu today?

By Richard Morris, Posted on April 6, 2016 at 5:51 am

Menu processing is a fundamental requirement in any UI Toolkit program. Without menu columns and entries your program would not function. You also signal menu events from toolbars and from buttons placed on windows and lists. Once a menu entry of any sort is fired, even if it’s signalled from within the DBL code, there is usually a using statement that dispatches to the required segment of logic. When configuring a new Windows Presentation Foundation UI, this technique is still valid.

The Symphony Framework UI Toolkit helps the developer bridge the connection between the DBL host program and the commands associated with menu bars, toolbars, and buttons – basically any WPF control that can bind to a command, and that includes selecting an item within a data grid!

Before we head into how the Symphony Framework UI Toolkit helps, let’s first look at some simple UI Toolkit script and code snippets. First, we can create menu entries in script:

[Script image m1: a menu column defined in window script]

Here we have a menu column called “mfile” with an entry called “f_new”. The important element is the entry name “f_new”, as this is the menu signal name that is used within the DBL code. You can create toolbars in code that will signal a menu event, for example:

[Code image m2: creating a toolbar button that signals f_new]

Here we have the same menu event name of “f_new” being signalled when the toolbar button is selected. We can also add buttons to tab containers:

[Code image m3: adding a button to a tab container]

Windows and lists can also have buttons added to them, both in script and in code:

[Images m4 and m5: adding buttons to windows and lists, in script and in code]

If we need to enable or disable the “f_new” capability within the UI Toolkit application, we need to do this for all instances. For example:

[Code image m6: enabling and disabling f_new for every instance]

The Symphony Framework UI Toolkit removes the verbosity of this style of programming while retaining the same logical concept. Using the Symphony.UIToolkit.Menu.MenuController class allows you to create a command:

[Code image m7: creating a command with the MenuController class]

Now that we have the command, we can bind it to any WPF control that accepts a command binding. So for a menu:

[XAML image m8: binding the command to a menu item]

We can utilise the same command on, for example, an Infragistics ribbon control:

[XAML image m9: binding the command to an Infragistics ribbon control]

Using a regular toolbar:

[XAML image m10: binding the command to a toolbar button]

And of course as a regular button on an input form or data grid view:

[XAML image m11: binding the command to a button on an input form or data grid]

In the host DBL code we can now simplify the control of the commands. By enabling the single command, we change the executable status of all the UI controls that it is bound to:

[Code image m12: enabling the command from the host DBL code]

When the command is executed, the toolkit menu entry is signalled to the host program. We need to define an event handler:

[Code image m13: defining the menu event handler]

This is a code snippet and not the complete event handler code:

[Code image m14: signalling the menu entry name back to the host program]

This code is setting up the required menu entry name and then calling into the host DBL code to process the menu selection. In the host program we need to ensure that we dispatch to the menu processing code segment. In this example the original toolkit code was processing an input form. We continue to process the input window using the i_input() routine if the toolkit is still the active environment. If we are running using the new WPF UI then we simply process the menu event:

[Code image m15: processing the menu event when the WPF UI is active]

We shall be covering menu, toolbar and command processing during the DevPartner 2016 pre-conference workshop as we all migrate an existing UI Toolkit program to a modern WPF user interface.



The stars have aligned—Microsoft’s plans now synchronize with Synergex’s founding principle of portability

By William Mooney, Posted on April 1, 2016 at 3:31 pm

It was very exciting to learn at Build this week that Microsoft will now be offering Visual Studio (VS) as a development tool to build anything to deploy anywhere—from Linux to Apple to Android, you name it! As Steve Ives referenced in his recent blog post, all of the Xamarin tools that we’ve been leveraging are now included with VS.

The really cool thing about all of this is that Microsoft’s direction is in perfect alignment with ours. At our Summit meeting at the end of last year—a gathering of Synergex customers who provide input on Synergex’s technology roadmap—we received overwhelmingly positive feedback on our proposal to include the ability to use traditional Synergy within Visual Studio.

More and more of our customers are adopting Synergy .NET while still maintaining and developing in traditional Synergy, leading them to work with two separate development IDEs: Workbench for traditional Synergy and VS for Synergy .NET. We are happy to say that this will all change with the release of Synergy/DE 10.3.3, pegged for OUR conference in May. The 10.3.3 release will allow developers to use VS for both traditional Synergy and Synergy .NET—the post-conference workshop is even dedicated to this topic! I’m thinking that this new functionality might inspire even more developers to play and experiment with Synergy .NET once they are using VS as their single IDE.

Synergex has always been committed to helping companies leverage their existing investments to stay competitive and current. This latest announcement from Microsoft really echoes that sentiment. After this week, reaching for the stars is now easier than ever!

