Synergex
 
  Synergy-e-News
 
  News and updates for Synergy/DE Developers :: January 6, 2011
 
 
 



 
IN THIS ISSUE
 

Keeping to Code Standards: Code Formatting in Synergy/DE 9.5's Synergy Language Integration for Visual Studio (SLI)
By Marty Lewis, Software Engineer, Synergy/DE

That Prototyping Thingamajig: Understanding and Using a Quality Assurance Tool that You Can Bank On
By Tod Phillips, Senior PSG Consultant, Synergy/DE

Synergy/DE Tech Tip
Debugging segmentation violations or signal trap errors on UNIX with core files

Quiz
Synergy/DE pros, see if you can answer this question!

Did You Receive Your 2011 Synergex Calendar?
If not, let us know!

Platform News
Read a selection of recent articles

Announcing the Q4 Support Survey Winner
Is it you?

 
 
 

Keeping to Code Standards: Code Formatting in Synergy/DE 9.5’s Synergy Language Integration for Visual Studio (SLI)
By Marty Lewis, Software Engineer, Synergy/DE

The big news that everyone is talking about is the recent launch of Synergy/DE 9.5, which delivers a plethora of features: the all-new Synergy .NET compiler, updates to Workbench, bug fixes, and the brand-new SLI product.

What is SLI, though? SLI, or Synergy Language Integration for Visual Studio, is the cornerstone of Synergy .NET development efforts. SLI adds a full set of Synergy Language features to the Visual Studio editor, including code snippets, project templates and, of course, IntelliSense. In short, we now offer you extensive support for Synergy Language in Visual Studio, which enables you to create Synergy applications that run under Microsoft’s .NET Framework.

With so many new development features, it is certainly going to take a little time to master them all and figure out how to best apply them to your own development. But in this article, I want to draw your attention to one important feature that you’ll need to know about as you begin your development in Visual Studio: code formatting.

Why should you care about the most mundane feature that the editor provides? Undoubtedly, your company has some rules about code formatting. Maybe indents need to be a certain size, or perhaps begin statements should be placed a certain way. You may not think of this every day. If your editor is set up correctly, you need only concern yourself with writing code. You can leave these details to the editor. With Visual Studio, SLI adds formatting options that give you fine-tuned control over formatting for Synergy code. The sooner these options are set up correctly for your coding style, the sooner you can be productive in the new environment.

To find the formatting options, select Tools > Options from the menu. Then select the Text Editor category in the left pane.
[Screenshot: Synergy/DE Text Editor options]
Scroll down to the Synergy/DE entry. This is where the formatting options for Synergy Language are located. Expand Synergy/DE and select Tabs.

The Tabs category controls the higher level indentation settings. The Indenting section at the top, innocuous as it looks, has primary control over all indentation for Synergy code. “None” turns off all indentation. “Block” uses simple block indents (in other words, a new line is automatically indented to the same tab stop as the preceding line). “Smart,” which is the default, is probably what you really want. Smart allows the fine-tuned formatting options (in the formatting pane discussed below) to control and format your code while you work.

Below the Indenting options are the Tab options to control the size of tabs, the size of indents, and whether to use tabs or just spaces for indents. The “Tab size” option controls the width of a tab character (in other words, it specifies the number of spaces represented by a tab character). Following that is the “Indent size” option, which controls the width of a single indent. While editing, the TAB key invokes an indent but does not necessarily insert a tab character unless whitespace the width of a tab is needed, as determined by the above options. You can prevent the insertion of tabs completely by selecting the “Insert spaces” radio button, or you can leave the behavior as-is by leaving “Keep tabs” selected.

Once you have the tab and indent settings where you want them, it is time to move on to the formatting pane by selecting Formatting from the left pane.

This formatting pane is a bit different from what you might be used to seeing in Workbench, but it does pretty much the same thing. In the upper-right section is a dropdown box with two default formatting schemes. These defaults are similar to the Style 1 and Style 2 options in Workbench, hence the names Style 1 and Style 2. Using one of these defaults is certainly a viable option, but the rest of the dialog offers you a concise, painless interface to customize your formatting options.

On the right is a formatting preview. While the preview doesn’t show every single language construct, it does show an example of just about every type of construct. The best way to see the preview work is to start selecting and clearing the options to the left of the preview. The preview will change as the options change, but options won’t be set for the code editor until you click OK. Once you click OK, code will be formatted using these settings as you type in the code editor, when you paste code into the editor, and when you run the document formatting option (Edit > Advanced > Format Document).

I’ll briefly describe the options below. For more information, press F1 while the formatting pane is active to open the help topic for these options.

The “Comments” section has two options. “Indent as statements” tells the editor to treat any comment in a procedure or data division that appears on its own line just like any other statement. Comments will be indented just as the statement “x = y” would be. The “Leave first column alone” option is the same, except that any comments that start in the first column will be left at the first column. This gives you some flexibility in comment formatting.
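For instance, here is a rough sketch (the routine is made up for illustration) of how a comment would be placed under each option:

subroutine set_flag
    out a_flag, d
proc
; This comment was typed in column 1. With “Leave first column alone” it
; stays right here; with “Indent as statements” alone it would be indented
; to line up with the assignment below.
    a_flag = 1
    xreturn
end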

The “Labels” section has three options. The “Indent as statements” option instructs the editor to treat all routine labels like statements, rather than doing anything special with the indentation. “Align to first column” forces the labels to start at column 1, regardless of anything around them. “Align to containing routine” aligns labels to the same column as the PROC in which they are contained.

The “Lone Begin/End block” section has two options, which apply only to begin/end blocks that do not immediately follow a conditional statement. The “Align contents to begin” option causes statements within the begin/end block to line up with the begin statement. The “Indent one level” option causes the statements to be indented one level from the begin statement.
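As a rough sketch (a made-up routine with a lone begin/end block), the two choices look something like this:

subroutine apply_discount
    inout a_total, d
proc
    ; with “Indent one level” selected:
    begin
        a_total = a_total * 9
        a_total = a_total / 10
    end
    ; with “Align contents to begin”, the two assignments above would start
    ; in the same column as the begin and end keywords
    xreturn
end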

The “Directives” section has two options. The “Indent as statements” option causes compiler directives (defines, includes, etc.) to be indented like statements. The “Align to first column” option causes compiler directives to start at the first column.

The “Conditional Blocks” section has two options, which determine how begin/end statements are aligned in relation to a conditional block (if, while, for, etc.). The “Align begin/end to block” option causes the begin/end statements to line up with the associated conditional. The “Align begin/end to contents” option causes the begin/end statements to be indented one level from the associated conditional.
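For example (again, a made-up routine), the two settings produce layouts along these lines:

subroutine check_limit
    in a_count, d
    out a_flag, d
proc
    a_flag = 0
    ; “Align begin/end to block”:
    if (a_count > 100)
    begin
        a_flag = 1
    end
    ; “Align begin/end to contents” would instead render the same block as:
    ;   if (a_count > 100)
    ;       begin
    ;       a_flag = 1
    ;       end
    xreturn
end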

The “Routines” section has two main options. The “No indentation in block” option causes all statements to line up with the routine’s proc and end statements. The “Indent contents of block” option causes all statements in both the data and procedure divisions to be indented one level from the proc statement. Selecting “Leave data div alone” aligns the entire data division with the routine.
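For instance, with “Indent contents of block” selected, a small (made-up) subroutine might be formatted like this:

subroutine next_invoice_number
    out a_number, d
    record
        last_used, d10
proc
    last_used = last_used + 1
    a_number = last_used
    xreturn
end

With “No indentation in block”, every line above would instead start in the same column as the proc and end statements, and checking “Leave data div alone” would keep just the data division aligned with the routine.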

The “Basic blocks” section has two options. A basic block is any code block other than a begin/end block, routine block, or conditional block; namespaces, classes, records, and structures are all basic blocks. The “No indentation in block” option causes all of the block contents to line up with the block’s header. The “Indent contents of block” option causes all of the block contents to be indented one level.
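To illustrate with a made-up record (the same idea applies to namespaces, classes, and structures), “Indent contents of block” would give you

record customer
    name,    a30
    balance, d10
endrecord

whereas “No indentation in block” would leave the name and balance fields in the same column as the record statement itself.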

Once you set up formatting options for your coding style, code will practically format itself while you work. You can then focus on mastering other great features available in Visual Studio and the .NET Framework.

For more information about SLI, refer to the Visual Studio online help (after you install SLI).

For more information about Synergy/DE 9.5, see the 9.5 Web site.


That Prototyping Thingamajig: Understanding and Using a Quality Assurance Tool that You Can Bank On
By Tod Phillips, Senior PSG Consultant, Synergy/DE

Building your application with the Synergy compiler is a lot like getting a loan at the bank. You know what you need, and what you want to do with it. There’s lots of review and double-checking on the bank’s side, however, before you get approval. And the less proof you provide that you can actually live up to your end of the bargain, the more it’s going to cost you in the long run.

We’ll Need Your Signature Here, Here, and … Here
If the compiler is the bank, then the Synergy Prototype Utility (dblproto) is the loan processor. This is the person who, working on behalf of the bank, tells you exactly which documentation is required and then ensures that every ‘i’ is dotted, every ‘t’ is crossed, and every blank has been filled in with the correct information. And just as the loan processor makes sure that every signature line is completed, dblproto reviews all of your subroutines, functions, methods, and properties, makes note of their requirements, and then guarantees to the compiler that the signatures match whenever any of those routines are called.

For dblproto, every subroutine or function (or method or property) must have a unique signature; a concrete example follows the list below. A signature is made up of a routine’s

  • namespace (a default is used if you haven’t explicitly provided it),
  • name,
  • parameters (including their data types), and
  • return data type (subroutines always return a ‘void’ value).
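To make that concrete, here is a small, hypothetical function (all names are invented) and the signature dblproto would record for it:

namespace MyCompany
function calc_tax, d
    in req a_amount, d
    in req a_rate, d
proc
    freturn (a_amount * a_rate) / 100
end
endnamespace

Its signature is, roughly: namespace MyCompany, name calc_tax, two required IN decimal parameters, and a decimal return type. Change any one of those pieces and, as far as dblproto is concerned, you have a different routine.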

Terms, Conditions, and Qualifications

By pushing your application under the nose of dblproto before submitting it, you’re simply asking that someone look everything over and verify that you’ve signed it all correctly, all with the goal of pre-qualifying your application so you can get the best terms possible.

Dblproto does its job by making sure that you aren’t putting words where a number is supposed to go (like passing an alpha to a decimal parameter), that you’re not trying to modify information that’s supposed to be inviolate (like attempting to overwrite an IN argument, or passing a literal to an OUT parameter), and that exactly the right number of blanks has been filled in (like calling a routine with too few or too many arguments).

In the end, the difference between compiling your code with or without strong prototyping is the same as the difference between asking the bank for a “standard conforming” loan versus a “no-doc” loan.  The bank might approve a loan without proof of income or employment, but they’re going to charge you a higher interest rate to do it. You might not notice too much of a difference month-to-month, but several years down the road you’ll probably look back and realize that the “no-doc” decision has cost you thousands – perhaps tens of thousands – more than if you’d been able to fully qualify yourself in the first place. It’s at that point that more prudent individuals will try to swap their no-doc loans for full-doc alternatives, thus securing themselves a lower price over time with a modicum of extra effort now.

The Benefits of Getting Qualified

The great thing about getting a standard conforming loan is that you can lock in a low interest rate for the life of the loan. Conversely, no-doc loans will often adjust to higher interest rates, and cost you more the longer you have them. It’s really the same for Synergy programs that don’t use strong prototyping: the time spent debugging runtime issues only gets more expensive as time goes on.

Plus, remember that Synergy .NET never gives out “no-doc” loans.  If you can’t compile your code with strong prototyping enabled, then that code is simply not qualified to go to .NET.

Fortunately, getting your code “pre-qualified” is fairly easy. Unfortunately, if you’re compiling your code with anything prior to Synergy/DE 9, then you’re stuck in a “no-doc” situation; there is no prototype utility, and the compiler can’t check to make sure that you know what you’re doing. As a result, any pre-V9 compilation carries additional inherent risks that it will fail to do what it’s supposed to do at some point in the future.

If you can compile under version 9, however, then you’re already getting better “low-doc” terms – perhaps without even knowing it. That’s because the version 9 compiler automatically prototypes all user-created functions and subroutines local to a single file and verifies that they’re being called correctly (unless, that is, you’ve specifically and intentionally requested “no-doc” by using the -qrelaxed compiler option). It also validates the signatures of any calls to Synergy-supplied subroutines and functions.
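As a quick, hypothetical illustration of that local checking (all names are invented), the version 9 compiler would reject the call below even without dblproto, because the function and its caller live in the same source file:

function double_it, d
    in req a_value, d
proc
    freturn a_value * 2
end

main
record
    result, d10
proc
    ; an alpha literal where a decimal parameter is expected --
    ; flagged at compile time under strong prototyping
    result = %double_it("oops")
end

The same kind of checking applies to calls to Synergy-supplied routines.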

For instance, the version 9 compiler knows that the signature of the Synergy GETLOG subroutine looks something like this:

void getlog(in a logical, out a translation, out n length)

In English: the compiler knows that GETLOG is a subroutine and will therefore have a ‘void’ return value, and that three arguments are required to call it.  The compiler is also aware that the first two arguments must be of type ‘alpha’, that the last must be numeric, and that neither of the last two arguments may be read-only.

Before version 9, if you messed up and invoked a GETLOG call with

xcall getlog(decimalVar, “Hello”, alphaVar)

then the pre-V9 compiler would have been just dandy with it. Of course, upon actually executing your program, the runtime would have choked the moment it hit the erroneous xcall. But what if that GETLOG call was only executed under a very complex and specific set of circumstances? Debugging the issue (assuming it was ever even reported) could be next to impossible. With the version 9 compiler, however, the above line of code would never have compiled in the first place:

%DBL-E-TYPPARM, Type mismatch for parameter 1 in routine getlog
%DBL-E-TYPPARM, Type mismatch for parameter 3 in routine getlog
%DBL-E-OUTPARM, Must be able to write to argument 2 because parameter was declared as OUT or INOUT

You can see that since the version 9 compiler already knows how to find the signatures of Synergy-supplied routines, you’re already “pre-qualifying” any line of code that calls them. It automatically gives you some of the supporting documentation you need, and proves that your application has a lower risk of failure. All that’s left is to qualify your own routines.

Getting Your Docs in Order

When you fill out an application for a standard conforming loan, you’ll need to provide the bank with supporting documentation. As the loan processor moves through the application, he’ll look at each section, find the supporting docs, and verify that the information you’ve provided for the application matches up to what you’ve documented.  In Synergy-land, the same thing will happen.

Since dblproto will be taking care of processing your loan, you’ll want to provide it with a small amount of information before it can get started.  It will need to know the name under which to file your supporting documents, where to put the documents that are being checked, and where to find documents that have already been checked.

SYNDEFNS sets the default namespace for functions and subroutines. It’s a good idea to set it at the system level and simply use your company’s name as the value. Note that if you’ve already done some work in this arena and have surrounded the code for your subroutines and functions with a namespace, like

namespace Flinstones
    function fred    ,^val
        …
    function wilma   ,a
        …
endnamespace

then whatever you’ve defined in your code will automatically take precedence over the SYNDEFNS value. Object-oriented code you’ve written will already have a namespace, and will therefore take precedence as well.

SYNEXPDIR sets the directory to which prototype files will be exported. When dblproto is executed, it will create one prototype file for every function or subroutine it encounters.

SYNIMPDIR tells dblproto where to find the “documentation” for routines that it’s already checked; it can point either to a single directory or to multiple directories separated by commas.

(By the way: When you’re starting out, it’s easier to just set SYNEXPDIR and SYNIMPDIR to the same path. That way, all of your prototype files get dumped into the same directory, and the Synergy Prototype Utility always knows where to look for them when the compiler needs information. As you get more proficient, it’ll be a good idea to start separating the export directories based on the overall functionality of the routines that the prototypes encompass, and then set SYNIMPDIR to a comma-separated list of export directories.)
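For example, on UNIX you might start out with something like the following (the namespace and directory are placeholders; substitute your own values):

export SYNDEFNS=MyCompany
export SYNEXPDIR=/usr/mycompany/protos
export SYNIMPDIR=/usr/mycompany/protos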

Submitting Your Application

Setting the appropriate environment variables will alert dblproto that you’d like to have it start processing your application. Getting started is as easy as finding a source file that contains one or more routines, navigating to its directory at the command line, and typing

dblproto <mySourceFileName>

The Synergy Prototype Utility will create a prototype file for each routine it finds, saving them in the directory defined by SYNEXPDIR. That’s it! You’ve just created your first piece of supporting documentation.

Of course, your next step is to make sure that the application can correctly cross-reference your documentation. The compiler implicitly uses the default namespace when looking for prototype files, so find the source file for any program or subroutine that actually calls one of your newly-prototyped routines, and try to compile it with

dbl <filename>

If there aren’t any errors, then the compiler was able to verify, through the prototype of your routine, that you called it correctly. You’ve just fully documented a piece of your application (at least as it relates to the use of your prototyped subroutine/function within the program you just compiled). Of course, you’ve only qualified a tiny part of your application, but dblproto has the ability to qualify whole chunks of behavior just as quickly.

Most people have a “base library” that contains many of the routines common to their entire application. Often, this library (whether an object library, an executable library, or a shared image) is built with a “build script.” If you’re working with one of these files, try editing it, and on the line just before it begins compiling your files, insert the following line:

dblproto source_directory:*.dbl

where source_directory is the logical or path that points to the source code for your library.

Save the file and then run the build script, and pay attention to the contents of the directory associated with SYNEXPDIR. You’re going to see a lot of files suddenly come into existence. These files are the supporting documentation for your application – at least for its base functionality – and you’ve just ensured that any compilation that accesses them is conforming to their requirements.

Also, if you got errors during the subsequent compilation, remember that these are good errors to know about!  Fixing the issues identified by strong prototyping may take several hours (or days), but doing so will improve the quality of your application and reduce its cost of ownership (read: support costs).  The minor effort involved in building a prototyped application is far preferable to the cost of keeping a “no-doc” solution!

As you’ve seen, the Synergy Prototype Utility is an extremely powerful yet incredibly easy-to-use tool. There are several options you can add to dblproto, of course, but generally (and assuming your environment variables are correct) you won’t need or want to use them. Things will get a little more complicated once (and if) you start defining multiple namespaces; the compiler will need to be told which namespaces to look at if you use anything other than what’s set in SYNDEFNS. It’s easy to do by adding one or more ‘import’ statements at the top of the appropriate source files, but that’s a topic for another day.

For now, just try adding ‘dblproto’ to your build steps, and begin reaping the benefits of a fully-qualified application!

For more information about dblproto, see the Synergy Language Tools manual. You can access the Synergy/DE Manuals on the Synergex web site.

 


Synergy/DE Tech Tip
Debugging segmentation violations or signal trap errors on UNIX with core files

When an application terminates abnormally with a segmentation violation or signal 11 error, a core file may help Synergy/DE Developer Support determine the cause.

A core file, or “core dump,” contains the state of a program’s working memory at the time the program terminated abnormally, along with other information such as the processor registers and operating system flags. Two factors must be taken into account when generating a core file for a Synergy program: the system ulimit as it pertains to core file generation, and the Synergy SIG_CORE environment variable.

To determine the core file size the system is currently set to generate, use the ulimit -a command, which shows all current limit values on the system. On Linux, the default core file size is 0, which means no core file is created.

To set the size of the core file, use the ulimit -c command. For example, to set the size of the core file to be unlimited, issue the command:

ulimit -c unlimited

Once you've confirmed that the system will generate a core file, make sure that SIG_CORE is set (to any value) in the environment. For example,

export SIG_CORE=on

By default, the user process’s core file will be generated with the name “core” in the current directory of the application; however, the location is configurable. If the system does not create the core file in the application’s current directory, use the command below to find all files on the system named “core”:

find / -name "core" -print>corefiles

This command creates a file named “corefiles” in the current directory, containing the paths of all the files it found. To examine the file, enter

cat corefiles | more

Once the core file is generated, send it to Synergy/DE Developer Support so we can analyze it and try to find the cause of the error.


Quiz
Synergy/DE pros, see if you can answer this question!

What is the output of the following program?

namespace confusion
  public class String
    public method String
    in req val, a
    proc
      m_val = val
    end
    public static method op_Explicit, a
    in req s, @String
    proc
      mreturn s.m_val.ToUpper()
    end
    private m_val, @System.String
  endclass
endnamespace

main
record
	s	,@String
proc
	open(1,o,"TT:")
	s = new String("abc")
	writes(1,(a)s)
end

Which answer is correct?

a. "abc"
b. "ABC"
c. A compilation error
d. A runtime error

Click here for the answer and explanation.


Platform News

Windows | Unix

Did You Receive Your 2011 Synergex Calendar?
If not, let us know!

[Image: Synergex 2011 calendar]

If you did not receive your Synergex 2011 calendar, or would like additional complimentary copies, please email us at synergy@synergex.com. Include your address and the quantity you’d like, and we’ll send them your way.

Announcing the Q4 Support Survey Winner
Is it you?

If you’re Steve Sarapata from InstantWhip Foods, it is! Steve is the winner of the Q4 Developer Support Survey contest and the recipient of a $100 American Express gift card!

Since 1933, distributors of Instantwhip Foods have grown to become international manufacturers and distributors of fine dairy and nondairy refrigerated food products to the foodservice and bakery industry. Instantwhip Foods distributors provide products under many popular brands such as Instantwhip Cream, Instantwhip Topping, B/C Topping, Instantblend Coffee Creamer, Baker's Finishing Touch, Whip 'n Top Pastry Topping, YAMI Yogurt, Ambrosia® Gourmet Frozen Yogurt, Whip 'n Ice Icing & Filling and many others. InstantWhip Foods’ Synergy/DE-based applications run all aspects of their business, from inventory to billing and distribution to payroll.

Want a chance to win? Let us know what you think! Fill out a survey next time you work with Developer Support on an issue. We use these surveys to monitor customer satisfaction, and each quarter we choose a winner by randomly selecting the name of a customer who has completed a survey after working with Developer Support. So, next time you call on us for support, let us know how we did, and you could win $100 just for giving us your feedback.

Thanks to everyone who completed a Developer Support survey in Q4, and we look forward to hearing from you again.

If you do not currently have Synergy/DE Developer Support, contact your Synergy/DE account manager for more information.