
Synergex Blog

An open letter to Synergy developer hiring managers

By Jacklin Garcia, Posted on May 2, 2019 at 12:58 pm

Jacklin Garcia

Dear Hiring Manager,

Congrats on your new Synergy developer hire! Recruiting isn’t easy, but now that you’ve found the right person for the job, the real hard work begins. The onboarding period should be used for training your new hire and setting their expectations about the job. This time is crucial, not only to ensure a smooth transition for your team, but also to make your new employee feel welcome. So where do you start?

Tell your Synergex account executive about your new hire

New Synergy developers in the DevPartner program are entitled to a welcome box full of goodies. This box contains educational materials like laminated cheat sheets as well as some fun Synergex-branded items. Don’t forget to mention the new hire’s T-shirt size when you reach out.

Download and modify the suggested 90-day plan

At Synergex, we use 90-day plans for all of our new hires. They are a key part of our onboarding strategy. We’ve created a template plan for you to download and modify from our education library. At the very least, we suggest grabbing the link for the new hire onboarding playlist on YouTube that is referenced in the plan and reviewing the other materials available in the library.

Send your new hire to a Synergy DBL Language Essentials class

The best way to learn Synergy DBL is to take this class. Many developers prefer hands-on learning, and in this class, students will build a simple application in Synergy. You can either send developers to our headquarters for the standard five-day training or request a custom class at your location. If a class at HQ isn’t being advertised when you need one, that’s OK! Reach out to us, and we’ll see what we can do.

Have your new hire subscribe, join, or follow us on various platforms

We don’t like to bombard our customers with emails, so connecting with us on sites like YouTube and GitHub is the best way to stay in the know about new educational content. The Synergex Community is also the place for Synergy developers to get advice from their peers and make suggestions for Synergy/DE enhancements. Bookmarking our online documentation is also a great idea!

Call support early and often

Our support team is incredible, and their services are free with your Synergy DevPartner subscription. Encourage new hires to call in (toll-free from the US and Canada 800.366.3472 or +1.916.635.7300 for all other countries) or email us with any Synergy-related problems they may encounter. Our support team is made up of engineers who are more than capable of troubleshooting issues and helping you track down bugs in your codebase. They are also happy to report any product bugs you may encounter directly to our development team.

Attend a Synergy DevPartner conference

Our conferences provide a dedicated time to learn and immerse yourself in the latest and greatest in the Synergy/DE ecosystem. It’s free for DevPartner subscribers, and it’s a unique opportunity for your new developers to get face-to-face interaction with their Synergy developer peers and the Synergex team.

If you need help identifying any additional educational materials, please reach out to us! Congratulations again on your new Synergy developer, and best of luck with the onboarding process.


The Synergex Education Team

What’s Up with Harmony Core?

By Harmony Core Team, Posted on March 22, 2019 at 2:49 pm


If you attended the 2018 Synergy DevPartner Conference in New Orleans, you probably remember hearing about Harmony Core. Harmony Core is Synergex’s open-source solution for allowing access to the data and logic in your Synergy applications. We had roughly six conference sessions on this topic, so if you weren’t at the conference or need a refresher, you can check out the videos. Since then we have been working fast and furiously on our backlog, ideas that surfaced at the conference, and adjustments to Harmony Core based on early adopter experiences. In this post, we’ll attempt to shed light on some of our recent decisions, struggles, and successes. We also want to share our plans for the near future so you know what to look forward to and what to expect from us.

From the Backlog

When we announced the release of Harmony Core at the conference, we discussed the backlog of tasks we wanted to complete. In the months since we shared the following slide, many of these items have been delivered.

What's next

Later in this blog post, we’ll discuss some of the enhancements that have been implemented, but we’d like to start by mentioning one of the more important changes we’ve made for optimization: implementing sparse Select. Without this, Harmony Core could never realize its performance potential because of the extra data sent over xfServer connections. Fortunately, the implementation turned out to be fairly straightforward. Harmony Core simply detects whether a channel is remote by using %ISINFO. If the channel is remote, Harmony Core builds a sparse Select object by processing the supplied EF Core LINQ expression and searching for field and object references that are required to create the result. If the channel is local, Harmony Core won’t build a sparse Select object because the setup required for the object would incur a small performance cost.
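As a rough illustration (in Python rather than Synergy DBL, with made-up names; the real implementation walks the supplied EF Core LINQ expression tree), the decision boils down to detecting a remote channel and collecting only the referenced fields:

```python
def collect_fields(expr):
    """Walk a toy filter expression and gather the field names it references."""
    if isinstance(expr, str):                # a bare field reference
        return {expr}
    _op, left, right = expr                  # e.g. ("and", lhs, rhs)
    return collect_fields(left) | collect_fields(right)

def build_select(channel_is_remote, filter_expr, projected_fields):
    """Return the sparse field list to request, or None for a full read."""
    if not channel_is_remote:
        return None              # local channel: sparse setup costs more than it saves
    # remote channel: request only the fields the query actually needs
    return sorted(collect_fields(filter_expr) | set(projected_fields))

expr = ("and", "CustomerId", ("or", "OrderDate", "Total"))
print(build_select(True, expr, ["Name"]))    # only referenced fields cross the wire
print(build_select(False, expr, ["Name"]))   # None: read whole records locally
```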

Conference Conversations

At the conference, we talked with many of you about how we could improve and extend Harmony Core. These conversations were invaluable in helping us prioritize items in the backlog, and they got us thinking about how we could better take advantage of open standards. One thing we started thinking about was our custom protocol for Traditional Bridge, which is the Harmony Core component that exposes traditional Synergy logic. This custom protocol was well documented and made use of JSON, so it wasn’t difficult to understand. However, it certainly seemed odd to have something custom like this, given our focus on open standards. We wanted to find an open solution, and as we looked around, JSON-RPC 2.0 caught our attention due to its use as the base for Microsoft’s Language Server Protocol. Microsoft uses this protocol for remote procedure calls across standard in/out, which is very similar to our use case. And on further investigation, we found that there was an open-source project that implemented JSON-RPC 2.0 in .NET Core. It’s written by Andrew Arnott, a long-time Microsoft employee with whom we’ve interacted extensively, making this an even more enticing choice.

Having a top-quality, high-performance library available sealed the deal, and we set out to switch the protocol. After some limited changes to Harmony Core server code (written in traditional Synergy), we had a working JSON-RPC 2.0 server and were able to communicate with it using vs-streamjsonrpc. Using someone else’s client to talk to our traditional Synergy server gave us a great deal of confidence in our adherence to the standard and ensured that we won’t have any weird behavior if you want your server to interface with some other non-Synergy system using the same mechanism. This is just one of the many steps we’ve taken to make Harmony Core open and accessible for all of you.
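For reference, the JSON-RPC 2.0 message shape is easy to sketch. The method name and parameters below are invented, but the required "jsonrpc", "method", "id", and "result" members come straight from the specification:

```python
import json

def make_request(req_id, method, params):
    """Build a JSON-RPC 2.0 request object."""
    return json.dumps({"jsonrpc": "2.0", "id": req_id,
                       "method": method, "params": params})

def make_response(req_id, result):
    """Build the matching JSON-RPC 2.0 success response."""
    return json.dumps({"jsonrpc": "2.0", "id": req_id, "result": result})

# a hypothetical Traditional Bridge call and its reply
print(make_request(1, "GetCustomer", {"customerId": 42}))
print(make_response(1, {"customerId": 42, "name": "Example Corp"}))
```

Because both sides agree on this envelope, any conforming client (such as vs-streamjsonrpc) can drive the server.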

Real-World Findings

Early on, a lot of our decision-making for Harmony Core was based on our experience developing and supporting xfServerPlus, xfODBC, and Symphony Harmony. Since the conference, we’ve been able to get Harmony Core into the hands of some early adopters, which has enabled us to make more targeted decisions based on real-world situations. We’ve learned a lot so far, and we’re looking forward to tackling any new scenarios that arise in the future. For now, we’ve got a solid list of improvements that should make the implementation process even smoother for future adopters.

Exposing Synergy Code as a Web Service

With our first early adopter, we exposed preexisting Synergy code as a web service. We had expected this to be a use case, but hadn’t yet worked through all the implications for the project layout and how to bring together all the CodeGen templates. We ended up making quite a few changes to the project template layout and tweaks to the CodeGen batch file included with the project template.

Initially, we weren’t sure whether we wanted to use .NET Framework or .NET Core for this project. Did we need .NET Framework AppDomains for code isolation? Or could we get away with the much more performant AssemblyLoadContext available with .NET Core? After extensive discussions, we determined that we needed only global/common/static isolation, so .NET Core proved to be a perfect fit. Since we were wrapping preexisting code that wouldn’t normally compile on .NET Core, we needed to make a .NET Standard version of the customer’s core libraries. This went pretty smoothly once we sprinkled a few .IFDEFs around to remove code related to WinForms and Windows services.

Once everything was running, we were greeted by a debugger that was behaving strangely. We looked at our isolated objects in the debugger, but the debugger wouldn’t display any values, and we weren’t able to expand any children. It turned out there was a bug in the .NET Core debugger that prevented evaluation of objects that existed in multiple AssemblyLoadContexts. This issue was reported to Microsoft and was fixed (in Visual Studio 2019). This marked our first of many successful Harmony Core implementations.

Querying with OData

After tackling all the bugs and features for our first adopter, we started to consider how best to expose search functionality. To implement things like search, we generally recommend using the Harmony Core EF Provider, which eliminates the need for programmer intervention. In this case, however, we couldn’t use the Harmony Core EF Provider due to the nature of the project’s file/data layout. A comprehensive search function was already built into the application, but we wouldn’t really be implementing OData if we just exposed custom methods that didn’t offer any OData query options. Thus, the idea for adapters was born!

The existing search had a particular set of operations: numeric ranges, field matches, “or field matches”, and arbitrary parameters. These sorts of operations can be expressed in a single flat data structure, so we created the dispatch machinery to convert OData queries into a single class instance and call the requested Synergy method, passing that instance to it. To maintain high performance when dispatching, we generated the dispatcher using low-level LINQ expressions to take advantage of their precompilation features. This means that Harmony Core does the complicated task of generating code to map an OData query to a Synergy data structure the first time the request is made. After that, it just runs the compiled code at full speed. This was a bit of a challenge to implement, but we think it will make web services much more performant.
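To make the flattening idea concrete, here is a toy sketch in Python. The real dispatcher compiles LINQ expressions once per query shape instead of parsing strings, and the field names and structure here are hypothetical:

```python
import re

def flatten_filter(odata_filter):
    """Flatten a tiny subset of OData $filter syntax into one flat structure,
    the kind of thing a preexisting Synergy search routine might accept."""
    criteria = {"matches": {}, "ranges": {}}
    for clause in odata_filter.split(" and "):
        m = re.fullmatch(r"(\w+) eq '(.+)'", clause)
        if m:                                    # field match
            criteria["matches"][m.group(1)] = m.group(2)
            continue
        m = re.fullmatch(r"(\w+) (ge|le) (\d+)", clause)
        if m:                                    # numeric range bound
            lo, hi = criteria["ranges"].get(m.group(1), (None, None))
            if m.group(2) == "ge":
                lo = int(m.group(3))
            else:
                hi = int(m.group(3))
            criteria["ranges"][m.group(1)] = (lo, hi)
    return criteria

print(flatten_filter("Name eq 'Smith' and Total ge 10 and Total le 99"))
```

The resulting flat structure can be handed directly to existing custom search code, which is the whole point of the adapter approach.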


Tag Fields and Key Relations

After the conference, we picked up a few more early adopters, including several with repositories that hadn’t been used for xfODBC, so they didn’t have any relations defined. As we worked with these repositories, we came to realize that Harmony Core could not describe a two-way relationship when a tag field was involved. A tag field is a literal that indicates record type when multiple record types are stored in a single ISAM file. Only foreign keys can specify tag literals, but repositories can’t have foreign-to-foreign key relations, which would allow for two-way relationships. (Repositories can have only access-to-access key relations and foreign-to-access key relations.) After some back and forth on this, we came up with an answer: enhance CodeGen. CodeGen now looks across relations defined for structures and glues both foreign keys together in a foreign-to-access/access-to-foreign key relationship. To do this, CodeGen uses either the foreign key naming convention or description fields for the foreign keys.

In addition to fun with foreign keys, we ran into some interesting EF Core limitations when describing the fields in a join. Synergy allows keys to be composed of up to eight segments that can total up to 256 bytes. EF Core has a loosely similar concept: composite keys. When using composite keys to describe a relationship, however, the parts of a composite key must have matching types. This is not a restriction in Synergy, so we had to figure out a way around this limitation. The solution we came up with was to add another type of field metadata: composite fields. These fields in the generated DataObjectMetadataBase class allow us to keep track of the size, position, and type of each supplied key segment, while exposing only a single dummy field to the EF Core Navigation property.
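The workaround can be pictured like this (hypothetical Python, not the actual generated DataObjectMetadataBase code): track every segment’s position, size, and type in metadata, but expose one combined value for EF Core to compare:

```python
class CompositeField:
    """One EF Core-visible 'dummy' field backed by several typed key segments."""
    def __init__(self, segments):
        # each segment: (name, position, size, type_name)
        self.segments = segments

    def value(self, record):
        # concatenate the raw characters of each segment so mixed-type
        # segments still compare as a single key
        return "".join(record[pos:pos + size]
                       for _name, pos, size, _type in self.segments)

# an alpha department code joined with a decimal employee number
key = CompositeField([("dept", 0, 3, "alpha"), ("empno", 3, 5, "decimal")])
print(key.value("ENG00042XXXX"))   # the combined join key
```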

The work we did to create composite fields in metadata opened up an additional opportunity. When we introduced Harmony Core at the conference, it was possible to create calculated fields simply by defining a property with a getter that returned the data in question along with a setter that either threw an exception or wrote to the actual underlying field, depending on your use case. But Harmony Core didn’t yet understand which fields were required when selecting a calculated field while using sparse Select with xfServer for remote data access. Thanks to composite fields, this is now implemented. To take advantage of this enhancement, you’ll need to create a property in your DataObject’s partial class and implement a partial class for its corresponding DataObjectMetadataBase with a private void method called InitializeCustomFields. InitializeCustomFields just needs to call AddFieldInfo (which creates a composite field with the same name as the calculated field), passing references to the field metadata for each field the calculation requires.

Over the past few months, we’ve been polishing our templates, building out our documentation, and fixing bugs reported by our early adopters. With all these enhancements, Harmony Core should be a Synergy developer’s best bet for adding Synergy web services. Still not sure if this is the right choice for your applications? We’re working on so much more.

Our Plan Going Forward

Because Harmony Core is an open-source project, we want to share not only our technical roadmap, but also our goals for the project. We also want to set expectations up front so you can help keep us accountable and on track.

Technical Direction

There are so many things we can do to extend the Harmony Core framework, but we’ve chosen a few large items and a smattering of smaller ones to focus on over the next 12 months. We’ll continue to post blog updates as features are delivered and the direction evolves.


Webhooks

Webhooks are user-defined HTTP callbacks that are usually triggered by some event, such as a status update for an order or the quantity of something dropping below a certain level. Webhooks are very useful when you have to glue two complex systems together. Imagine your company has grown by acquisition and you find yourself needing to merge the underlying data and business functions of two different IT systems. If you’re trying to build a unified RESTful view of both systems, your view is going to need to be notified when important events occur in the underlying system. Webhooks are the best way to do this. We think this sort of functionality will be crucial in many scenarios. If you have thoughts about how you would like to use this technology, please let us know. We aren’t certain what shape this support will take, but we will be actively pursuing it. Large (1-3 months)
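In its simplest form, firing a webhook is just an HTTP POST of an event payload to each registered callback URL. A minimal sketch (the event name, payload shape, and URL are invented; Harmony Core’s eventual design may differ):

```python
import json
from urllib import request

def build_event(event, payload):
    """Serialize a webhook event body."""
    return json.dumps({"event": event, "data": payload}).encode()

def notify_subscribers(event, payload, subscriber_urls):
    """POST the event to every registered callback URL."""
    body = build_event(event, payload)
    for url in subscriber_urls:
        req = request.Request(url, data=body,
                              headers={"Content-Type": "application/json"})
        request.urlopen(req, timeout=10)

# e.g. notify_subscribers("inventory.low", {"sku": "A1", "qty": 3},
#                         ["https://example.com/hooks/inventory"])
```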


Adapter Enhancements

While we’ve already implemented adapters, there is so much more we can do to extend their uses. Currently, adapters apply only to filters, but we can further blur the line between operations provided by EF Core and custom code. This would entail implementing additional attributes to indicate that a particular field or collection needs to be expanded. In general, our design philosophy is to flatten OData queries into the sort of data structures that customers currently use to implement custom solutions. Supporting custom Synergy code is a very important part of Harmony Core; many of you have decades of very important business logic that you want to expose to the world. We would be doing you a disservice if the full OData query experience were not applicable to your important business logic. Adapter enhancements will enable you to take full advantage of uniform OData query syntax, whether you’ve implemented an endpoint with custom code or are using the Harmony Core EF Provider. Large (1-2 months)


Nested Subqueries

Not all customer data is laid out the way we developers would like it to be. Sometimes this is for historical reasons, or it might be the result of business requirements. Whatever the reason, we want to make sure it’s possible to fully expose your data via OData and EF Core. This means we need to add at least some support for nested subqueries. The main use case we’re thinking of is when data for different branch locations is stored in separate files. In order to join across all branches, we will have to split the query into several parts, storing some intermediate results in a hash lookup table. While this doesn’t amount to support for completely arbitrary subqueries, we think it will solve most customer needs. Large (1-3 months)
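The per-branch split can be sketched as a classic hash join (Python, with hypothetical record shapes): build a lookup table from one file, then probe it with rows from the others:

```python
def join_across_branches(build_rows, probe_files, key, select):
    """Join records from separate branch files through a hash lookup table."""
    lookup = {}
    for row in build_rows:                       # phase 1: build the hash table
        lookup.setdefault(row[key], []).append(row)
    results = []
    for rows in probe_files:                     # phase 2: probe with each file
        for row in rows:
            for match in lookup.get(row[key], []):
                results.append(select(match, row))
    return results

customers = [{"cust": 1, "city": "Sacramento"}]
branch_orders = [[{"cust": 1, "total": 50}, {"cust": 2, "total": 75}]]
print(join_across_branches(customers, branch_orders, "cust",
                           lambda c, o: (c["city"], o["total"])))
```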

A Migration Path for xfServerPlus

Prior to the conference, we added support for reading xfServerPlus method catalogs. We also added initial support for generating the required wrappers and glue to allow existing xfServerPlus methods to be called using the JSON-RPC 2.0 protocol and the Traditional Bridge portion of Harmony Core. This needs to be extended to support all the data types and coercion that exist in the wild. Large (1-3 months)

Durable Transaction Log

At the conference we rhetorically asked, “Is this ACID?” Spoiler alert: we’re pretty close but not quite all the way there. The missing piece of the puzzle is a durable transaction log, which is really just a fancy name for an on-disk data structure used as part of a two-phase commit. As transactions are committed to the underlying ISAM files, Harmony Core would write a log of the operations it is performing, along with their current state, so that a hardware or software fault that interrupts ISAM operations can be detected. In addition to this verification role, the transaction log would enable us to undo any changes that were only partially committed. When combined with innovations at the operating system level (such as Storage Spaces Direct in Windows Server 2019) and ISAM resiliency in Synergy/DE 11, we think the transaction log will provide the foundation for highly available, fault-tolerant Synergy applications and web services. Large (2-4 months)
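A bare-bones sketch of such a log (Python; a real implementation would also have to handle log truncation, torn writes, and the ISAM specifics):

```python
import json
import os
import tempfile

class TransactionLog:
    """Minimal write-ahead log for a two-phase commit."""
    def __init__(self, path):
        self.path = path

    def _append(self, record):
        with open(self.path, "a") as f:
            f.write(json.dumps(record) + "\n")
            f.flush()
            os.fsync(f.fileno())     # durable before we touch the data files

    def begin(self, txn_id, operations):
        # phase 1: record intent before performing any ISAM writes
        self._append({"txn": txn_id, "ops": operations, "state": "pending"})

    def commit(self, txn_id):
        # phase 2: mark the transaction complete once every write has landed
        self._append({"txn": txn_id, "state": "committed"})

    def pending(self):
        """On recovery: anything begun but never committed must be undone."""
        committed, begun = set(), {}
        with open(self.path) as f:
            for line in f:
                rec = json.loads(line)
                if rec["state"] == "committed":
                    committed.add(rec["txn"])
                else:
                    begun[rec["txn"]] = rec["ops"]
        return {t: ops for t, ops in begun.items() if t not in committed}

log = TransactionLog(os.path.join(tempfile.mkdtemp(), "txn.log"))
log.begin("t1", ["STORE orders rec-1001"])
log.begin("t2", ["STORE orders rec-1002"])
log.commit("t1")                     # t2 is interrupted before commit
print(log.pending())                 # only t2 needs to be rolled back
```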

Tooling Improvements

CodeGen integration with MSBuild – We’ve wanted to improve the process of building projects that use CodeGen templates. The degree to which we rely on CodeGen templates for Harmony Core projects is staggering. Keeping generated code up to date should be a build system operation rather than something for humans to manage (i.e., forget). Small (1-2 weeks)

CodeGen integration with NuGet – Delivering updates to CodeGen templates is tricky and manual. Those are a pair of attributes that should never be associated with tasks that must be performed frequently. Our answer to this problem is to ship template (.tpl) files using NuGet packages. This way, your build system can reference a specific version number for the templates you’re using. If updates are available, all you have to do is bump the requested version, and CodeGen will download the requested package of templates. Small (1-2 weeks)

.NET CLI scaffolding tool – The command line interface for .NET Core introduced a concept known as scaffolding. Rather than shipping complex logic in wizards, project templates, or item templates, the dotnet tool allows you to offer a scaffolding generator. Conceptually, it’s very similar to a wizard in Visual Studio, but it’s much easier to work with and can be used without Visual Studio. Because Harmony Core relies heavily on CodeGen templates and source conventions, it’s currently more difficult than it should be to add new functionality to an existing project. Once we have a scaffolding tool, it will be much easier to add new endpoints, new structures, custom code in partial classes, and wrappers around existing code. Small (1-2 weeks)

Concurrency Improvements

Multiple xfServer connections – Currently the Harmony Core EF provider uses only a single xfServer connection. We need to build additional infrastructure into the provider to maintain a pool of worker threads that each have an open xfServer connection. And the FileChannelManager class will need some additional smarts to deal with request queuing and to use existing threads with requested channels. Small (1-2 weeks)

Additional configuration options for FileChannelManager and, more generally, the pooling mechanisms – Currently we open file channels and never close them for the duration of a FileChannelManager object. This improves the performance of individual requests, but it’s wasteful when there are hundreds of files and only a few are commonly accessed. The plan is to implement a least-recently-used (LRU) eviction strategy. Keeping track of the total list of all files that have ever been opened and the relative frequency of file opens should allow us to auto-tune the number of channels to keep in the LRU list. Small (1-2 weeks)
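An LRU eviction policy for open channels could look roughly like this (Python sketch; the opener/closer callables stand in for the real channel open/close logic, and all names are hypothetical):

```python
from collections import OrderedDict

class ChannelPool:
    """Keep at most max_open channels, evicting the least recently used."""
    def __init__(self, max_open, opener, closer):
        self.max_open = max_open
        self.opener = opener
        self.closer = closer
        self.channels = OrderedDict()    # filename -> open channel, LRU first

    def get(self, filename):
        if filename in self.channels:
            self.channels.move_to_end(filename)      # now most recently used
        else:
            if len(self.channels) >= self.max_open:
                victim, channel = self.channels.popitem(last=False)
                self.closer(victim, channel)         # evict the LRU channel
            self.channels[filename] = self.opener(filename)
        return self.channels[filename]

closed = []
pool = ChannelPool(2, opener=lambda f: "channel:" + f,
                   closer=lambda f, ch: closed.append(f))
pool.get("orders.ism")
pool.get("customers.ism")
pool.get("orders.ism")       # touch orders again
pool.get("items.ism")        # pool is full, so customers.ism is evicted
print(closed)
```

Auto-tuning would then amount to adjusting max_open from observed open frequencies.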

Traditional Bridge SSH connection multiplexing – Currently, we open one SSH connection for each Traditional Bridge process on the remote server. For a very large number of processes, this is wasteful because connections are sitting idle most of the time. We can instead create up to 10 sessions per SSH connection (or more if the operating system is configured for it), which would enable us to have many more worker processes without the additional SSH overhead. This would be configurable for cases where SSH connection scaling isn’t the bottleneck. Small (less than 1 week)

Better Debugging

Diagnosing issues in production is one of the most challenging tasks for a developer. We’re considering two features to make this less daunting: per-request logging levels and per-request Traditional Bridge debugging.

Per-request logging levels – Harmony Core offers extensive logging along with a logging framework that works well with Synergy data types. This is great in development, but in production it can destroy performance or collect sensitive data that shouldn’t be logged. By providing a custom HTTP header with a request, the logging level could be set specifically for that request. This means you could get full logging for a single request on a production server while enhanced logging remains turned off for the rest of your requests. Small (less than 1 week)
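The mechanics are simple; a sketch of the idea (the X-Log-Level header name is made up for illustration):

```python
import logging

LEVELS = {"debug": logging.DEBUG, "info": logging.INFO,
          "warning": logging.WARNING, "error": logging.ERROR}

def level_for_request(headers, default=logging.WARNING):
    """Choose a logging level for one request from a custom header."""
    wanted = headers.get("X-Log-Level", "").lower()
    return LEVELS.get(wanted, default)   # unknown/absent -> production default

# a single diagnostic request opts into verbose logging...
print(level_for_request({"X-Log-Level": "debug"}))
# ...while every other request keeps the quiet default
print(level_for_request({}))
```

In practice you would also gate this behind authentication so arbitrary callers can’t turn on verbose logging.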

Per-request Traditional Bridge debugging – Similar to per-request logging levels, we can use custom HTTP headers to cause the current request to launch a debuggable Traditional Bridge process. This would cause a separate .dbr and command line to be run on the remote system for this one request, enabling you to attach a debugger from Visual Studio to the running remote process, even in a production environment with thousands of normal requests being serviced concurrently. Small (less than 1 week)

Harmony Core Goals

What will determine the success of this project? Ultimately, it will be you, the customer. So we want to make sure that implementing and interacting with the web services you build with Harmony Core is as painless as possible.

For 2019, our goal is to have at least four customers that have successfully deployed Harmony Core web services or are in the final stages of developing real-world solutions that make extensive use of Harmony Core. This is not as aggressive as it sounds; we already have four customers in various stages of development. But we aren’t just aiming to get four customers; we want those customers to be happy with the quality of Harmony Core and with the features it provides. To determine if these customers are happy, we’ll keep track of the net promoter score for all the different parts of Harmony Core. For example, if you have a complaint about the documentation or need more help than we expected in order to implement something, that’s a sign that we are missing documentation, examples, educational material, blog posts, or functionality. We’ll need to hear this from you to know how to do better. Once an issue is found, we’ll address it and follow up to determine if the solution actually worked. The number of adopters can’t and won’t be our only metric for success. Additionally, we anticipate that we’ll need to interact with non-technical decision makers and non-Synergy developers, who in some cases will know best whether Harmony Core is actually solving their business needs.

Open-Source Technology: The New Frontier

We feel that open-source development is a two-way street; in order to be successful, we need non-Synergexian contributors to create pull requests and issues on GitHub. We’re hoping for at least four non-Synergexian contributors to submit pull requests or issues by the end of 2019. The contributions can be for documentation fixes, new samples, or even new core functionality. We will tag GitHub issues that we think are a good fit for external contributors. For a non-tagged issue, we would ask that you give us a heads-up on GitHub to make sure we aren’t currently working on a solution for the issue. In general, we’re looking to add features and functionality that will be applicable to the majority of users, so we won’t want to incorporate code that is specific to a company or line of business.

What You Can Expect from Synergex

When it comes to open-source products, Synergex’s stance has generally been to provide the software as-is with no guarantees. For Harmony Core, however, you can expect Synergex to provide more robust support. Although the development path for Harmony Core is different from that of our proprietary products, we want companies of all sizes to feel comfortable taking such a large dependency on Harmony Core. We think this can be accomplished using GitHub issues, but we’re looking to you, the Harmony Core community, to tell us what kind of service-level guarantees your management team will need.


Our Developer Support team is already training up on Harmony Core, and we plan to train them to write and understand CodeGen templates. There are quite a few new areas inside CodeGen that have been added as a result of the Harmony Core project, and you can expect the support team to have up-to-date knowledge on those enhancements. We will also begin looping Support into our ongoing development and deployment efforts with customers. We believe there is no training substitute for extensive guidance while getting your hands dirty on the inner workings of a product, and we think the team will be very well equipped to tackle and triage Harmony Core support cases in the near future.

In addition to providing help through Developer Support, we’re planning to incorporate educational events and materials to ensure that you can successfully implement and interact with Harmony Core web services. We will be hosting monthly office hours for anyone interested in hopping on a GoToWebinar session to ask questions or work through implementation issues. Office hours are a common feature in open-source projects run by companies that want to make sure complex, difficult-to-solve issues aren’t ignored, missed, or left to languish without solutions. You can also expect regular blog posts recapping development progress, and we’re interested in posting project postmortems (with identifying information scrubbed, of course). We think letting the Harmony Core community see the good, the bad, and the difficult in project implementation will give developers and management confidence that they will also be successful.


Open-source products are usually documented by their creators, and this will be true of Harmony Core. Our tech writing team will be involved (organizing, editing, and helping direct the documentation), but all documentation and samples will come from developers working on Harmony Core. This is a departure from how documentation has traditionally been developed at Synergex, but we think this will be the best way to document this project.

Meet the Harmony Core Team

Jeff Greene, Harmony Core Product Manager

Jeff joined Synergex in 2006, and he brought his passion for open source and standards-based technologies with him.

His first contribution to our product suite was Synergy DBL Integration for Visual Studio, for which he architected and implemented items involving IntelliSense, debugging, and migration tools. He championed the importance of running Synergy on .NET Core and was the lead developer of that project.

Jeff has been instrumental in Synergex’s embrace of Service Oriented Architecture (SOA) and standards-based technologies. When the need for a unified solution to creating web services for all Synergy/DE applications arose, Jeff and Steve Ives created the Harmony Core Framework with an emphasis on using standardized technologies such as OData, Swagger, ASP.NET Core, .NET Core, and Entity Framework Core.

Jeff has worked with various departments within Microsoft over the years, and has made significant contributions to the open source Microsoft CoreRT project.

Steve Ives, Harmony Core Co-Architect

Following an early career in operating systems, networking, and systems administration, Steve realized that his real passion and talent lay in the field of software development.

Having worked for one of the largest Synergy/DE developers in the United Kingdom, Steve was frequently contracted by Synergex to represent the company in Europe. These engagements included presenting on the company and its products at meetings, delivering training seminars on the use of the products, and consulting with other customers about how to address specific technology requirements.

Steve formally joined the Synergex Professional Services group in 1997. Having held the position of Senior Consultant for over 20 years, Steve has helped many Synergy developers architect, design, and implement a wide variety of software projects. Each of these projects has involved Synergy/DE in some way, and many have involved a wide range of other development languages and technologies.

In recent years, Steve has been a major contributor to several Synergy-related open source development projects, including:

  • The CodeGen code generator which can be used to automate the generation of Synergy DBL and other types of code in a wide variety of software scenarios
  • The SQL Replication reference implementation—this provides a working example and most of the code needed to replicate a Synergy application’s ISAM and Relative data to a SQL Server database, in near-real-time
  • A PDF API that allows Synergy developers to easily add PDF file creation to their applications
  • The Harmony Core framework which provides Synergy developers the ability to quickly and easily build RESTful Web Service APIs to expose their Synergy data and business logic to a wide variety of applications and environments

Steve is passionate about assisting software developers to continue to leverage the investment that they have made in their Synergy code base over the years, and to continue to see a return on their ongoing investments in their code.

Johnson Luong, Harmony Core developer

Johnson joined Synergex in 2014 as a QA engineer. He got his start by testing Synergy releases and the Synergex Resource Center Community, as well as verifying trackers and automating the testing of those trackers. You can read about his testing techniques and automated testing at Synergex.

In addition, after moving into a software development role, Johnson made many contributions to our Synergy DBL Integration for Visual Studio (SDI) product, including migrating the property pages from Windows Forms to WPF and the product installers from InstallShield to a newer technology. He has since developed all of our Synergy product installers with the open-source WiX toolset. He worked closely with FireGiant, the custodian of WiX, to ensure the Synergy installers use their most up-to-date technology.

Prior to joining Synergex, Johnson was an active member of the Association for Computing Machinery (ACM) at his alma mater. Along with his team, he even won their biannual International Collegiate Programming Contest. His background in testing, along with his experience in Visual Studio development and a solid foundation in algorithm design, makes Johnson the perfect addition to round out the Harmony Core team.

Highlights from the 2018 Synergy DevPartner Conference

By Synergex, Posted on January 14, 2019 at 3:26 pm


As we head into 2019, we reflect back on one of our favorite events of the year: the 2018 Synergy DevPartner Conference in New Orleans.

Packed with sessions on topics ranging from modern and agile development practices to RESTful web services to QA and learning culture, not to mention an engaging and illuminating keynote presentation from Microsoft’s Donovan Brown, the conference offered a plethora of info, ideas, tips, tricks, and plans for the future. Case studies and customer demos provided insight into practical implementations of technical concepts, and we were jazzed to introduce our new open source RESTful web services project, Harmony Core. Bourbon Street wasn’t too bad either! The legendary food, music, and culture of one of America’s most historic cities provided a great backdrop to a fun and productive week.

Here are some key takeaways from the conference:

  • Improve your productivity and practices by adopting more efficient development methodologies.
  • Enhance years of Synergy data and code with new technologies, including enabling connectivity through RESTful web services.
  • Security and disaster recovery are important for compliance—stay up to date with Synergy SSL and operating system security patches.
  • Use traditional Synergy in Visual Studio (it’s not as hard as it seems!) to significantly boost productivity, lower the barrier to continuous code integration, and improve your processes and software quality.
  • Move to the cloud. Developing and running your Synergy application in the cloud is relatively easy and provides a convenient path to expand your infrastructure to meet demands.
  • Education in the workplace is important—create an onboarding program that includes presentations, videos, and discussion.

The Synergex team came back to the office energized and ready to implement big plans for Synergy in the new year. We look forward to seeing you at the 2020 conference!

You can check out videos and slides of conference sessions here.


Using the Power of PowerShell with Synergy

By Jerry Fawcett, Posted on November 30, 2018 at 1:56 pm


Like it or not, change is inevitable. Now that you’ve mastered the Command Prompt, it’s slowly but surely being replaced by PowerShell. Microsoft actually set this process in motion back in 2002, when development of what became PowerShell began.

Starting with Windows 10, the PowerShell command prompt has replaced the Command Prompt as the default. (To control this, toggle the Taskbar setting “Replace Command Prompt with Windows PowerShell in the menu when I right-click the start button or press Windows key+X.”)

Comparing Command Prompt and Windows PowerShell

PowerShell Up

So you opened a PowerShell window and now everything you try results in copious output in ominous red text? Luckily for those of us who are learning PowerShell from the ground up, PowerShell comes with very complete and verbose help. Just type

get-help cmdlet

For example, you’d use “get-help Where-Object” to learn about the powerful Where-Object cmdlet (commonly aliased as “where”). And if you’re familiar with any OO language, you’ll probably find PowerShell quite natural.

One would think PowerShell is a Windows-only tool, but surprisingly, Microsoft has made it open source, and it’s also available on macOS, CentOS, and Ubuntu.

You can use PowerShell together with Synergy in many ways, and I’d like to share some of what I’ve learned. Using the power (pun intended) of PowerShell can be helpful in getting information on Synergy products from a machine. Getting this information without PowerShell would be difficult—especially for a non-technical person—and potentially error prone.

By preceding a PowerShell command with “powershell”, you can actually run it from a Command Prompt without opening a PowerShell console; the command is passed to the PowerShell processor as a parameter. However, by doing this, you’ll lose the awesome power of tab completion available in the PowerShell console. Try it: open a Command Prompt window and type “powershell /?”.

A demo of get-help cmdlet

The PowerShell is Yours!

(At this point, I feel I must tell you that Synergy tools are currently not supported in a PowerShell environment. Ideas post, anyone?)
To list all Synergy products:

Get-WmiObject -ComputerName COMPUTERNAME -Class Win32_Product | Sort-Object Name | Select-Object Name | Where-Object {$_.Name -like "*Synergy*"}
Output:
Synergy DBL Integration for Visual Studio 10.3.3f
Synergy/DE (X64) 10.3.3f
Synergy/DE 10.3.3f
Synergy/DE Online Manuals 10.3.3
Synergy/DE xfNetLink .NET Edition (X64) 10.3.3f

To list all Windows updates on a machine:

(New-Object -ComObject "Microsoft.Update.Session").CreateUpdateSearcher().QueryHistory(0,(New-Object -ComObject "Microsoft.Update.Session").CreateUpdateSearcher().GetTotalHistoryCount()) | Select-Object Title, Description, Date

If you need to retrieve all events from the Windows Event viewer that deal with the runtime, simply have the end user double-click on a batch file containing the command below and send you the file it creates (C:\temp\SynEventViewerEntries.txt).

powershell Get-EventLog -logname application -source dbr^|Format-List MachineName, Message, TimeGenerated ^|out-file C:\temp\SynEventViewerEntries.txt

To list all installed software on a machine:

Get-WmiObject -ComputerName FAWCETT -Class Win32_Product | Sort-Object Name | Select-Object Name, Version

Just so you’re aware, when you get started, you may find no script will successfully run, because script execution is disabled by default. You can enable it as follows:

First, to verify PowerShell’s execution policy, type

Get-ExecutionPolicy

You’ll probably see

Restricted

which is the default.

To enable execution, you must open the PowerShell command prompt as an Administrator and type

Set-ExecutionPolicy Unrestricted -Force

And just for fun, if, like me, you miss the Windows Experience score that was first introduced with Windows Vista but has since been removed, you can use PowerShell to retrieve it. The Windows Experience score is a benchmark that measures your PC’s performance in five categories: processor, memory, graphics, gaming graphics, and hard disk performance. The overall score is the lowest of these benchmarks. To get your machine’s score, run

Get-WmiObject -Class Win32_WinSAT

If you want to learn more, you can search online. Every PowerShell topic I searched for resulted in a wealth of information.

More Power(Shell) to you!

Some good resources:

Microsoft Official PowerShell documentation

Getting Started with Windows PowerShell

PowerShell: The smart person’s guide

CodeGen 5.3.6 Released

By Steve Ives, Posted on September 23, 2018 at 4:43 pm

Steve Ives

Just a quick note to announce that CodeGen 5.3.6 has been released and is available for immediate use. This latest version represents the culmination of a series of recent releases which together have added significant new features across the entire product, including the ability to generate code based on metadata found in the xfServerPlus Synergy Method Catalog.

Many of the new features were added specifically to support code generation for the new Harmony Core RESTful web services framework that we are excited to be introducing at the upcoming DevPartner Conference in New Orleans next month.

If you are joining us for the post-conference Harmony Core workshop, you will need to have this new version of CodeGen installed on your development system; you can download it here. And even if you’re not intending to use the new Harmony Core framework, there are many new features that may be useful to all CodeGen users; we recommend this release for general use.

We’re looking forward to seeing you all in New Orleans between October 8th and 12th. If you haven’t signed up yet, it’s not too late! You can still sign up here.

Performance troubleshooting

By Phil Bratt, Posted on May 14, 2018 at 10:22 am


In the 1984 movie classic Ghostbusters, we are introduced to Bill Murray’s character Dr. Peter Venkman, a professor of paranormal studies, testing subjects for the gift of clairvoyance—the ability to gain information about an object, person, location, or event through extrasensory perception. While it is clear Dr. Venkman does not take such things very seriously, we can see the advantage of such an ability, particularly for developers.

Stop me if you’ve heard this before: “We upgraded X and now Y is happening.” In this case, Y is usually associated with a negative behavior like slow performance, a slew of errors, or a bad user experience. Events like these may induce weariness, nausea, dry mouth, and other various side effects that are usually listed in American pharmaceutical commercials. In summary, they’re unpleasant. If only there were some means to predict these events and avoid them in the future…

Unfortunately, most developers are not gifted with the power of clairvoyance to anticipate problems with upgrades before they happen. But maybe instead of a crystal ball, there are tools that can help us avoid upgrade failure. Let’s look at some things that can help anticipate and/or prevent such issues from happening.

A relatively recent addition to the Synergy documentation is the Configuring for performance and resiliency topic in the Installation Configuration Guide. This topic discusses things that one should take into consideration when running Synergy on any system, and it’s based on years of experience from the Synergex Development staff and the results of their testing. If you haven’t read this section yet, I highly recommend doing so. If you’ve already read it, I recommend a quick refresher the next time you’re looking at major system or software changes.

In Support, we often see issues when developers virtualize systems that run Synergy or when data is migrated and then accessed across a network rather than being stored locally. Both scenarios are discussed in this topic. And as part of Synergy’s web-based documentation set, it’s updated regularly with the latest information. Make sure you take a look before any major upgrade in case there are changes.

Other useful tools for avoiding problems are the Synergex Blog, the Synergy migration guides, KnowledgeBase articles like Guideline for debugging performance issues, and the release notes that come with each version of Synergy. Remember that even if you are just changing the operating system and/or hardware and not your version of Synergy, you should re-review these materials. Some of the considerations they outline may now be relevant, even if they didn’t affect you previously. Also, when testing, remember to take load testing and processes running over time into account. We commonly see pitfalls when developers neglect these two factors in testing.

Taking Action

Now let’s say that despite your excellent planning, you do see a performance issue. What can you do? Here are some steps I’ve found helpful that might get overlooked. Most have to do with simply eliminating factors that affect performance.

  • Establish a baseline

If you’re going to diagnose a problem in performance, the first thing to do is isolate code or create a piece of code that demonstrates the problem. Make this your baseline test for all of the various configurations you’re going to test. This will make your tests consistent as well as eliminate code changes as a factor.

  • Use a metric

Establish a program you’re going to use to measure the difference in performance. If you’re using a traditional Synergy program as your baseline, you can use the Synergy DBL Profiler, which will count CPU time for you. Just make sure you pick the same metric for your testing—CPU time is not the same as real time. This step will enable you to get measurable results to test what is actually making a difference.
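The CPU-time versus real-time distinction is easy to get wrong. This small Python sketch (illustrative only, not Synergy's profiler) shows the two metrics diverging when a program spends its time waiting rather than computing:

```python
import time

wall_start = time.perf_counter()   # real (wall-clock) time
cpu_start = time.process_time()    # CPU time consumed by this process

time.sleep(0.2)                    # waiting on I/O or a lock burns wall time, not CPU time

wall = time.perf_counter() - wall_start
cpu = time.process_time() - cpu_start
print(f"wall={wall:.3f}s cpu={cpu:.3f}s")  # wall is ~0.2s; cpu stays near zero
```

An I/O-bound slowdown can therefore be invisible to a CPU-time metric, which is why the same metric must be used consistently across every configuration you test.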

  • One by one, test all the things

I’ve found that the easiest way to plan and visualize testing is to make a tree. Each layer is one aspect you’re testing, and it continues to branch with every additional aspect. For example, I had a situation where a production machine was migrated to a new Synergy version, operating system, and hardware, and the OS was virtualized, all in one move. We picked one thing to change (the virtualization of the OS) and tested it.



By doing this, we established that virtualization was a factor, because a virtualized environment was slower than a non-virtualized one. We then compared those to the old and new Windows versions, but continued with virtualized and non-virtualized environments using the same virtualization software.

(Diagram: the virtualized and non-virtualized branches from the previous step, each tested under Windows 8 and Windows 10.)


On average, this produced the same result. (It was I/O processing, so we did an average of 10-20 runs based on how volatile the results could be.) Next, we compared the Synergy 10 runtime with the Synergy 9 one.

(Diagram: each Windows 8/Windows 10 branch from the previous step, split again into Synergy 9 and Synergy 10 runs.)


The tree continued growing until all of the factors were considered and tested.

Closing Thoughts

It can be tedious to test one change at a time, but without that kind of granularity, you can’t establish which change affected performance and by how much. In the example I mentioned above, we established that virtualizing the hardware was causing a problem because of the way the virtual machine software emulated separate cores. We never would have come to such a conclusion without carefully eliminating the many different changes one at a time.

After you’re able to establish exactly which changes caused the performance issue(s) and by how much, you can work on a fix or provide a solid case to whichever software support representative you need to contact to get a fix.

You might know most of this already. You might even know of some methods, tips, etc., for performance issues that I didn’t discuss. Maybe you are clairvoyant and you already knew the contents of this post before I did. Either way, I hope you find this information helpful when you look at performance in the future, in both preventative measures and problem diagnosis.

Big Code

By Richard Morris, Posted on April 13, 2018 at 7:13 am


If you have large (and I mean LARGE) blocks of code in single source files – and by large I mean 20k lines plus – then you may be encountering “SEGBIG” (“Segment too big”) compiler errors. This issue arises because a code segment is simply too big for the compiler to handle, and it is usually the result of many years of development in which a single routine has just grown over time.

If you encounter SEGBIG issues, as a customer I recently worked with did, this quick blog post will give you some practical ideas for managing the issue and modifying the code to allow for future development and expansion, without having to rewrite the world.

First off, it’s not the physical number of lines of code in the source file that’s the issue, it’s the lines of code and data definitions within each routine block: subroutine or function. Developers have encountered the problem for many years and the resolution has previously been to chop out a section of code, make it into a subroutine or function, and somehow pass all the appropriate data to it – usually by large numbers of arguments or common/global data blocks.

The “today” way is not too dissimilar but is a little more refined: turn the code block into a class. The first major advantage is class-based data. This removes the need to create subroutines or functions that accept large numbers of arguments, or to create large common or global data blocks. As an example:

subroutine BigRoutine

.include ''                     ;common data include

record localData
    localRoutineData    ,a10

proc
    call doSomeLogic
    call doOtherLogic
    xreturn

doSomeLogic,
    ...
    return

doOtherLogic,
    ...
    return

endsubroutine

Obviously this code will not give us a SEGBIG issue, but it’s an example of the structure of the code. The routine has a common data include and private data, and in the routine body we make multiple local label calls. When too much data and too many lines of code have been added, we will encounter a SEGBIG error.

To address this, in the same source file, we can create a class with class-level data (the former routine-level data) and methods for the local call labels. For example:

namespace CompanyName

    public class BigRoutineClass

        private record localData
            localRoutineData    ,a10
        endrecord

        public method Execute, void
        proc
            doSomeLogic()
            doOtherLogic()
            mreturn
        endmethod

        method doSomeLogic, void
            .include ''         ;common data include
        proc
            ...
            mreturn
        endmethod

        method doOtherLogic, void
            .include ''         ;common data include
        proc
            ...
            mreturn
        endmethod

    endclass

endnamespace

In this code, the Execute method becomes the entry point. All the existing code that made the label calls is moved into this method, and the calls are changed to method invocations.



Then we can change the existing BigRoutine code:

subroutine BigRoutine

record
    routineInstance     ,@CompanyName.BigRoutineClass

proc
    routineInstance = new BigRoutineClass()

    ;Run the existing logic via the class entry point
    routineInstance.Execute()

    xreturn

endsubroutine

Although the code changes I’ve described here sound monumental, if you use Visual Studio to develop your Traditional Synergy code the process is actually quite simple. Once you have created the scaffolding routine and defined the base class with class level data (which really is a case of cutting and pasting the data definition code), there are a few simple regex commands we can use that will basically do the work for us.

To change all the call references to class method invocations you can use:

Find: ([\t ]+)(call )([\w\d]+)

Replace: $1$3()


To change the actual labels into class methods, simply use the following regex:

Find: ^([\t ]+)([a-zA-Z0-9_]+)[,]

Replace: $1endmethod\n$1method $2, void\n$1proc


And to change the return statements to method returns, use:

Find: \breturn

Replace: mreturn


These simple steps will allow you to take your large code routines and turn them into manageable classes, which can be extended as required.
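The three substitutions can be sanity-checked outside Visual Studio as well. Here is a small Python sketch of the same find/replace patterns applied to a tiny stand-in routine body (Python uses `\1` rather than `$1` for group references, and needs the MULTILINE flag so that `^` matches at each line):

```python
import re

# A tiny stand-in for a traditional Synergy routine body with a label call.
src = """    call doSomeLogic
    doSomeLogic,
        return
"""

# 1. Change call references to method invocations.
src = re.sub(r"([\t ]+)(call )([\w\d]+)", r"\1\3()", src)

# 2. Turn labels into class methods.
src = re.sub(r"^([\t ]+)([a-zA-Z0-9_]+)[,]",
             r"\1endmethod\n\1method \2, void\n\1proc",
             src, flags=re.MULTILINE)

# 3. Change return statements to method returns
#    (the \b boundary stops it rewriting xreturn or mreturn).
src = re.sub(r"\breturn", "mreturn", src)

print(src)
```

Running this turns `call doSomeLogic` into `doSomeLogic()`, scaffolds the label into an `endmethod`/`method`/`proc` block, and rewrites `return` as `mreturn`.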

If you have any questions or would like assistance in addressing your SEGBIG issues, please let me know.

Synergy/DE Documentation Reimagined

By Matt Linder, Posted on April 9, 2018 at 11:01 am

Matt Linder

If you’re a Porsche enthusiast, you probably know about the Nürburgring record set a few months back by a 911 GT2 RS, the latest iteration of the Porsche 911 (see the Wikipedia article). Like many, I find it interesting that the best* production sports car in the world isn’t a new design, but the result of continuous improvement and development since its introduction over 50 years ago. One company, Singer Vehicle Design, takes old 911s and resurrects them as carbon fiber kinetic sculptures in the aesthetic of the older 911s, but with performance that matches some of the fastest new 911s from the factory. They describe these as “Reimagined,” and you can see a video about a Singer 911 or visit the Singer website for more information.

Here in the Documentation department at Synergex, we’ve been doing some reimagining of our own. The Synergy/DE documentation has been continually improved over the years, but since 10.3.3c, we’ve published the Synergy/DE documentation with a new format, a new look, and a new set of underlying technologies and practices. The goal: documentation that is quickly updated to reflect product changes and user input, that is increasingly optimized for online viewing, and that is increasingly integrated with other Synergy/DE content. (And soon it will be better integrated with Visual Studio as well; see a recent Ideas post for details.) You can access the doc just about anywhere (even without internet access), it offers better viewing on a range of screen sizes, and it’s poised for further improvements. If you haven’t seen our new “reimagined” doc, check it out at

Here are some of the highlights:

  • A better, more responsive UI for an improved experience when viewing documentation from a desktop system, a laptop, or a tablet.
  • Technologies that facilitate more frequent updates and allow us to increasingly optimize content for online use.
  • Improved navigation features. The Contents tab and search field adapt to small screens, the UI includes buttons that enable you to browse through topics, and a Synergy Errors tab makes it easy to locate error documentation.
  • Quick access to URLs for subheadings. To get the URL for a topic or subheading (so you can share it or save it for later), just right-click the chain-link icon next to a heading and copy the link address.
  • The ability to print the current topic (without printing the Contents tab, search field, etc.) and to remove highlighting that’s added to topics when you use the search feature.
  • Local (offline) access. If you’re going somewhere where internet access is limited, download and install the Local Synergy/DE Product Documentation, which is available on the Windows downloads page for 10.3.3.

See the Quick Tips YouTube video for a brief visual tour of the documentation, and let us know what you think. In the footer for every documentation topic, there is a “Comment on this page” link you can use to send us your input. We look forward to hearing from you!

*Just an opinion!

A Winning Formula

By Richard Morris, Posted on February 15, 2018 at 3:28 am


For a recent project I’ve worked with a customer who wished to provide their users with an engaging desktop application that would allow management of product formulations. They had a Synergy UI Toolkit version and also elements of the required application in a third-party system. However, neither met the needs of the users. After a review and discussions about their requirements, we agreed on a Synergy .NET Windows Presentation Foundation–based application using Infragistics tooling for the user experience.

The basic requirements of the application were to allow the creation and maintenance of formulations. A formulation contains the components required to make a finished product; for this customer, the final product is an aerosol.

Let’s take a look

The basic interface is built using Infragistics controls to handle navigation (the Ribbon menu control), listing and selection of data (the powerful DataGrid), hierarchical representation of the formulation components (the TreeView), and management of finished product details (the Property Grid):

Of course, using the Infragistics DockManager allows the user to drag and reposition all the available windows to their liking.

There are powerful searching facilities, or QBE (Query By Example) controls. These allow the user to provide snippets of information, and the application will query the Synergy DBMS database using Symphony Harmony and the Synergex.SynergyDE.Select class:

The top line of the QBE controls allows the user to enter data in the columns they wish to search, so they select only the data they require and don’t have to filter through a list of thousands of formulations.

Because the application is written in Synergy, the existing printing capabilities from the original UI Toolkit application have been retained without change:

The whole application is written in Synergy .NET and utilises the Symphony Framework for controlling data access and presentation. If you would like more details, or would like to know how you can build modern applications with Synergy .NET, please drop me an email.

What a Cracking Idea

By Richard Morris, Posted on February 9, 2018 at 5:52 am


The Synergex Resource Center Community site has a number of great features including the Answers and Ideas portals. Ideas is the place to post your ideas for improving Synergy and related products. You can vote for ideas and provide comment and feedback to help the whole community be more successful. The Synergex community site had an idea posted recently:

You can read the full idea at the Synergex Resource Center.

Synergy DBMS Manager

The Synergy DBMS Manager allows you to query and manage data in your Synergy data files using Symphony Data Objects.

Code generating Symphony Data Objects for your files is a simple process. Ensure you have your Synergy Repository configured; you need to define the structure and associated file definition for each file you wish to manage using the utility. There are full instructions at the Symphony Framework page that walk you through the few simple steps to building your data object assembly in Synergy .NET. Not sure how to build Synergy .NET assemblies? Then send me your repository and I’ll do it for you!

To query data, you simply issue the required select command and define the response data object:

Simple, and you can scroll through the results. The idea posted was to be able to export the selected data to Microsoft Excel so that further review and analysis can be performed.

A new toolbar button has been added, and the GemBox Spread assembly is used to easily create the Excel document. Now you can export all the selected rows to an Excel document:

If you’d like more details, please visit the Symphony Framework page or contact me directly.




Protecting Data

By Richard Morris, Posted on February 7, 2018 at 5:06 am


Whenever I work with customers’ code, there is almost always a need to “run” their applications. That means I need data. To ensure that Synergex protects our customers’ data and that we conform to the various data security requirements of today’s world, there is usually a need to cleanse the data before I get access to it. This cleansing process is like redacting words or phrases in a document – it prevents the consumer (in this case, me) from seeing the real data.

With Synergy today there are basically two ways to do this: write a program to clear or set the data fields within the records in the file to a specific value, or configure Synergy xfODBC and use a database management tool. If you don’t want to write code, or to license and configure xfODBC, you can use the Synergy DBMS Manager.

The Synergy DBMS Manager is a simple utility that can be used to redact data in your Synergy DBMS files. You can download and install the Synergy DBMS Manager utility from the downloads page. The utility uses Symphony Data Objects to describe the data in your files – these are easy to code generate.

Code generating Symphony Data Objects for your files is a simple process. Ensure you have your Synergy Repository configured; you need to define the structure and associated file definition for each file you wish to manage using the utility. There are full instructions that walk you through the few simple steps to building your data object assembly in Synergy .NET. Not sure how to build Synergy .NET assemblies? Then send me your repository and I’ll do it for you!

Synergy DBMS Manager

Once you have the data object library built, you simply run the utility:

You’ll need to locate the “Data object assembly” that you have just built and then select the “Table mapping method”. Enter the password, and you’ll be in the utility and ready to manage your Synergy DBMS data.

You can perform simple queries to locate and review the data in a file:

Now you can easily redact the data in the file. Remember, you are affecting the ACTUAL data in the file, so make sure this is a copy of the data file and NOT your LIVE data – there is no rollback functionality!

Simply issue an update command:

Notice the result count () at the bottom of the screen. Because we didn’t specify a where clause, all records in the file were affected by the requested update. Now you can perform a simple query and see the results:

This data is now cleaned. If you need to send data to Synergex Support to assist them in resolving an issue, this is a great way to redact and protect that data before sending it.
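The warning about missing where clauses applies to any SQL-style update tool, not just the Synergy DBMS Manager. A quick illustration using Python's built-in sqlite3 module (hypothetical table, not Synergy data):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE customer (id INTEGER, name TEXT)")
con.executemany("INSERT INTO customer VALUES (?, ?)",
                [(1, "Alice"), (2, "Bob"), (3, "Carol")])

# No WHERE clause: every row in the table is redacted.
cur = con.execute("UPDATE customer SET name = 'REDACTED'")
print(cur.rowcount)  # 3 -- the result count reported after the update
```

Just like the result count shown by the utility, `rowcount` confirms how many records the update touched, which is another reason to run redactions only against a copy of the data.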

If you’d like more details, please visit the Symphony Framework page or contact me directly.

CodeGen 5.2.3 Released

By Steve Ives, Posted on December 1, 2017 at 11:18 pm

Steve Ives

We are pleased to announce the release of CodeGen version 5.2.3. The new release includes new features, addresses some issues found with previous releases, and also paves the way for once again being able to use CodeGen on non-Windows platforms through experimental support for the .NET Core environment.

As always, you can download the latest version of CodeGen from here.

CodeGen Version 5.2.3 Release Notes

  • Added a new experimental utility to the distribution. The Code Converter utility can be used to automate bulk searches within, and bulk edits to, an application's code. This utility is in a usable form; however, it is still a work in progress and is likely to undergo substantial changes as it evolves.
  • Added two new utility routines (IsDate.dbl and IsTime.dbl) that are referenced by some of the supplied sample template files.
  • Corrected a regression that was introduced in the previous release which caused the field loop expansion token <FIELD_SQL_ALTNAME> not to default to using the actual field name if no alternate name was present.
  • Performed an extensive code review and cleanup, updating the code in several areas to take advantage of new features available in the Synergy compiler, and also improving efficiency.
  • Fixed an issue that was causing the CreateFile utility's -r (replace file) option to fail: an existing file would not be replaced even if the -r option was specified.
  • Fixed an issue in the CreateFile utility that would result in an unhandled exception in the event that invalid key information was passed to XCALL ISAMC.
  • Made some minor code changes to allow CodeGen to be built in a .NET Core environment. We hope to leverage .NET Core to once again support the use of CodeGen on non-Windows systems (starting with Linux) in the near future.
  • This version of CodeGen was built with Synergy/DE 10.3.3d and requires a minimum Synergy version of 10.1.1 to operate.

Symphony Framework Components

  • We no longer ship the Symphony Framework sample templates with CodeGen. You can obtain the latest Symphony Framework templates from the Symphony Framework web site (
  • There were no Symphony Orchestrator changes in this release.
  • There were no Symphony Framework CodeGen Extensions changes in this release.

CodeGen 5.2.2 Released

By Steve Ives, Posted on October 25, 2017 at 3:00 pm

Steve Ives

We are delighted to announce the availability of the CodeGen version 5.2.2 release that includes the following enhancements and changes:

CodeGen Version 5.2.2 Release Notes

  • Added a new field loop expansion token <FIELD_FORMATSTRING> which can be used to access a field's format string value.
  • Added a new command-line option -utpp which instructs CodeGen to treat user-defined tokens as pre-processor tokens. This means that user-defined tokens are expanded much earlier during the initial tokenization phase, which in turn means that other expansion tokens may be embedded within the values of user-defined tokens.
  • Removed the RpsBrowser utility from the distribution; it was an experimental project that was never completed.
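To picture what -utpp changes, here is a toy two-pass expander in Python (the token names are hypothetical, and this is not CodeGen's real engine). Treating user-defined tokens as pre-processor tokens means they are substituted in an earlier pass, so any expansion tokens embedded in their values still get expanded later:

```python
builtin_tokens = {"<STRUCTURE_NAME>": "customer"}        # stand-in for a built-in expansion token
# A user-defined token whose value itself embeds a built-in token.
user_tokens = {"<TABLE_PREFIX>": "app_<STRUCTURE_NAME>_"}

def expand(text, tokens):
    # Naive single-pass substitution of each token with its value.
    for name, value in tokens.items():
        text = text.replace(name, value)
    return text

template = "table = <TABLE_PREFIX>v1"

# Pre-processor pass (user tokens) runs first; normal expansion runs second,
# so the built-in token inside the user token's value is still resolved.
result = expand(expand(template, user_tokens), builtin_tokens)
print(result)  # table = app_customer_v1
```

Without the early pass, the `<STRUCTURE_NAME>` embedded in the user token's value would survive into the output unexpanded.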

This version of CodeGen was built with Synergy/DE 10.3.3d and requires a minimum Synergy version of 10.1.1 in order to operate.

“D” is for Developer

By Richard Morris, Posted on September 20, 2017 at 6:55 am


Development of your traditional Synergy code in Microsoft's Visual Studio was introduced at the DevPartner conference back in 2016. Using an integrated development environment (IDE) like Visual Studio not only promotes better development practices and team collaboration, but also shows prospective new hires that your tooling is the latest and greatest.

The next release of Synergy—10.3.3d—includes all the capabilities required to fully develop and build your traditional Synergy–based applications in Visual Studio. During a recent engagement, I worked with a team of developers to migrate their existing Synergy Workbench development to Visual Studio, with great results. Although there are a few steps to complete, the benefits of developing in Visual Studio more than outweigh the effort it takes to get there. And if you think developing in Synergy Workbench is great and productive, just wait until you are using Visual Studio—there will be no turning back! Here are the high-level steps to move your traditional Synergy development into Visual Studio.

The first place to start is the Synergy repository. We all have (or should have) one. Synergy now provides a Repository project that allows you to build your repository files from one or more schema files. If you have multiple schema files, you can use the new pre-build capability to run your existing command scripts to create a single, ordered schema file, or to load the repository your own way—simple. So why use the Repository project? Because you can load all your individual schema files into it, and if any of them change, your repository will be rebuilt automatically.
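The pre-build step mentioned above can be as simple as a command script that concatenates your individual schema files in dependency order. A minimal sketch (the schema file names here are hypothetical; substitute your own, in whatever order your repository load requires):

```shell
#!/bin/sh
# build_schema: assemble one ordered schema file from individual
# schema files, concatenated in the order they are listed.
# File names used with it are hypothetical examples.
build_schema() {
    out=$1
    shift
    cat "$@" > "$out"
}

# Typical pre-build call, schemas listed in dependency order:
# build_schema combined.scm customers.scm orders.scm invoices.scm
```

The same idea works as a Windows batch file using `type` or `copy` if your build runs there.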

Next, create your library projects. These are either executable (recommended) or object libraries. Be sure to reference the Repository project using "Add Reference…"; you no longer define the repository environment variables RPSMFIL and RPSTFIL. This step ensures that if your Repository project is rebuilt, any projects referencing it will be rebuilt as well. Then add the source files for the routines that make up your library, and build. You may have a few build issues to resolve—the 10.3.3d compiler is a little stricter, and unresolved references will need to be addressed. Any environment variables required to build your software should be set on the project's common properties page, or, if they are library specific, on the project's environment page.

Finally, create your main line programs. Create the required project with a single main line program or with multiple main line programs. A multiple main line project lets you keep all your programs in one place and easily specify which program to run.

Now you can build and run your traditional Synergy code from Visual Studio—and even better, you can debug through the code using the powerful Visual Studio debugger.

Using UI Toolkit? Keep an eye out for a future blog post showing how to easily incorporate window script file builds into your development process.

Building for UNIX? Not a problem. A future post will show the simple steps to target the UNIX platform from within Visual Studio.

We are here to help! Synergex can help with every aspect of getting your traditional Synergy development environment inside Visual Studio. Just ask your account manager or contact me directly.
