
Synergex Blog


Wheel, Scroll, Oops.

Posted on October 14, 2016 at 6:45 am


If you answer “yes” to the following questions, then please read on: Do you have a Synergy UI Toolkit application? Do you use standard (not ActiveX) list processing with a load method? Do you run your software on Microsoft Windows 10?

Windows 10 offers a new feature that lets you mouse over a list and use the mouse wheel to scroll it, without the list actually getting focus. It’s a great feature, but if your UI Toolkit application displays a standard list that uses a load method, that mouse-over scroll operation will attempt to “process” the list and cause the list load method to execute. That doesn’t sound too bad, but if you have method data being passed through from the l_select() or l_input() routines, that data will not be passed to your load method, because you are not actually in l_select() or l_input(). Also, because the list has not gained focus, you have potentially not been through your “my list is gaining focus, so set up the load parameters” logic, which again means that when your load method executes it is in an unknown state.

When your load method executes in this “unknown” state and tries to access method data or your uninitialized load data, a segmentation fault may occur. The user uses the Wheel, the list attempts to Scroll, and Oops: your application crashes.

Thankfully, the Synergex team have found the issue and resolved it, and the fix will be in the upcoming 10.3.3b patch. If you are experiencing this issue today and need a resolution now, contact Synergex Support, who can provide you with a hotfix.


DevPartner 2015 – WOW!

Posted on May 15, 2015 at 6:37 pm


That was the week that was: the DevPartner 2015 conference in Philadelphia. OK, so I’m biased, but I really have to say this was one of the best conference weeks I’ve had the pleasure to be part of for many years. There were some really great sessions. The HBS customer demonstration rocked! They came to a conference a couple of years ago, did a tutorial on xfServerPlus, and with that newfound knowledge (and some PSG guidance) created a cool web bolt-on to their existing Synergy app.

We saw some fresh new faces from Synergex: Marty blasted through the Workbench and Visual Studio development environments we provide and showed some really great tools and techniques. Phil gave us a 101 introduction to many of the “must know” features and capabilities of Synergy DBMS, and of course was able to address Jeff’s and my performance issues (you had to be there :)). Roger demonstrated his wizardry and enlightened everyone about the issues you need to consider when transferring data across local and wide area networks. I was the bad router!

Bill Mooney set the whole tone of the conference with a great opening presentation showing just how committed Synergex are to empowering our customers with the best software development capabilities available.

My first day’s session followed and gave me the opportunity to demonstrate how you really can bring all our great tools together to create true single-source, cross-platform applications that run on platforms as diverse as OpenVMS, UNIX, and Microsoft Windows, and even on a Sony watch running Android Wear!

Steve Ives went 3D holographic with videos from his recent trip to the Microsoft Build conference, which showed just how amazing the Microsoft platform is becoming, and we aim to continue to be a first-class player in that arena.

So many of our products are reaching a level of maturity that blows the competition away. Gary Hoffman from TechAnalysts presented a session showing how to use CodeGen and Symphony in the real world and showed just what you can achieve today in Synergy.

Jeff Greene (Senior .NET engineer @ Synergex) and I presented a rather informal (read: written the night before) session showing the performance and analysis tools in Visual Studio 2015 that you can use to identify problem areas and memory leaks in your application. Within minutes, Brad from Automated System forwarded me an email he’d just sent to his team:

“At the Synergex conference just this morning, they just showed fantastic new diagnostics tools in Visual Studio 2015.  I just put the Team on the trail of potential memory issues with these new tools in a Virtual PC environment so we don’t alter our current developer stations. This could both reduce the memory footprint and improve performance.” – You can’t beat such instant feedback!

The tutorial time gives attendees the opportunity to play with the latest tools on a pre-configured virtual machine – plug in and code! And we continued the hands-on theme with Friday’s post conference workshop – where we built the DevPartner 2015 App from the ground up!


Thanks to everyone for coming and making the conference such a great success. It’s our 30th conference next year so keep your eyes and ears open for dates and details – it will be a conference not to miss!


Starting Services on Linux

By Steve Ives, Posted on July 24, 2010 at 3:14 pm


For a while now I’ve been wondering what the correct way is to start boot-time services such as the Synergy License Manager, xfServer and xfServerPlus on Linux systems. A few years ago I managed to “cobble something together” that seemed to work OK, but I had a suspicion that I only had part of the solution. For example, while I could make my services start at boot time, I wasn’t sure that they were getting stopped in a “graceful” way during system shutdown. I also wondered why my “services” didn’t show up in the graphical service management tools.

My cobbled together solution involved placing some appropriately coded scripts into the /etc/rc.d/init.d folder, then creating symbolic links to those files in the appropriate folders for the run levels that I wanted my services started in, for example /etc/rc.d/rc5.d.

This week, while working on a project on a Linux system, I decided to do some additional research and see if I couldn’t fill in the blanks and get things working properly.

My previous solution, as I mentioned, involved placing an appropriately coded script in the /etc/rc.d/init.d folder. Turns out that part of my previous solution was correct. For the purposes of demonstration, I’ll use the Synergy License Manager as an example; my script to start, stop, restart and determine the status of License Manager looked like this:

#!/bin/sh
#
# synlm - Start and stop Synergy License Manager
#

# Load the Synergy environment so that synd can be located
. /home/synergy/931b/setsde

case "$1" in

    start)
        echo -n "Starting Synergy License Manager"
        synd
        ;;

    stop)
        echo -n "Stopping Synergy License Manager"
        synd -q
        ;;

    restart)
        $0 stop
        $0 start
        ;;

    status)
        # Look for a synd process, excluding the grep itself and
        # rsynd (whose name also contains "synd")
        if ps ax | grep -v grep | grep -v rsynd | grep synd > /dev/null
        then
            echo "License Manager is running (pid is `pidof synd`)"
        else
            echo "License Manager is NOT running"
        fi
        ;;

    *)
        echo "Usage: synlm {start|stop|restart|status}"
        exit 1
        ;;

esac

exit 0

If you have ever done any work with UNIX shell scripts then this code should be pretty self-explanatory. The script accepts a single parameter of start, stop, restart or status, and takes appropriate action. The script conforms to the requirements of the old UNIX System V init subsystem, and if placed in an appropriate location will be called by init as the system run level changes. As mentioned earlier, I had found that if I wanted the “service” to start, for example when the system went to run level 5, I could create a symbolic link to the script in the /etc/rc.d/rc5.d folder, like this:

ln -s /etc/rc.d/init.d/synlm /etc/rc.d/rc5.d/S98synlm

Init seems to process the files in a run level folder alphabetically, and the existing scripts in the folder all seemed to start with S followed by a two-digit number, so I chose the S98 prefix to ensure that License Manager would be started late in the system boot sequence.

This approach seemed to work pretty well, but it was kind of a pain having to create all those symbolic links … after all, on most UNIX and Linux systems, run levels 2, 3, 4 and 5 are all multi-user states, and all of them probably require License Manager to be started, as the repetition below shows.
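In practice, registering the service by hand meant repeating the link once per multi-user run level folder; something like this (extrapolating the rc5.d example above):

# Create an S98 start link in each multi-user run level folder
ln -s /etc/rc.d/init.d/synlm /etc/rc.d/rc2.d/S98synlm
ln -s /etc/rc.d/init.d/synlm /etc/rc.d/rc3.d/S98synlm
ln -s /etc/rc.d/init.d/synlm /etc/rc.d/rc4.d/S98synlm
ln -s /etc/rc.d/init.d/synlm /etc/rc.d/rc5.d/S98synlm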

Then, almost by accident, I stumbled across a command called chkconfig. Apparently this command is used to register services (or more accurately init scripts) to be executed at various run levels. PERFECT … I thought! I tried it out:

# chkconfig --level 2345 synlm on

service synlm does not support chkconfig

Oh! … back to Google … Turns out I was missing something really critical in my script, and believe it or not, what I was missing was a bunch of comments! After doing a little more research I added these lines towards the top of the script:

# chkconfig: 2345 98 20
# description: Synergy/DE License Manager
# processname: synd

Lo and behold, this was the missing piece of the puzzle! Comments … you gotta love UNIX! The chkconfig line tells chkconfig which run levels the service should run in (2, 3, 4 and 5), along with its start priority (98) and stop priority (20). So now all I have to do to start License Manager at boot time, and stop it at system shutdown, is use the chkconfig command to “register” the service.
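Registering the service is now a single command; chkconfig reads those comment headers and creates the appropriate start and stop links for you:

# chkconfig --add synlm

(The chkconfig --level 2345 synlm on command attempted earlier can then be used to turn the service on or off for specific run levels.)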

And there’s more … With License Manager registered as a proper service, you can also use the service command to manipulate it. For example, to manually stop the service you can use the command:

# service synlm stop

And of course you can also use similar commands to start, restart, or find the status of the service. Basically, whatever operations are supported by the init script that you provide.
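For example, with the script shown above, any of these now work:

# service synlm start
# service synlm restart
# service synlm status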

Oh, by the way, because License Manager is now running as a proper service it also shows up in the graphical management tools, and can be manipulated by those tools … very cool!

Of course, License Manager is just one of several Synergy services that you could use this same technique with; there’s also xfServer, xfServerPlus, and the SQL OpenNet server, and the same pattern applies, as sketched below.
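An init script for each of those would follow exactly the same pattern as the synlm script; only the chkconfig headers and the commands that start and stop the daemon change. A minimal sketch of the header block for such a service, with the process name left as a placeholder since the exact daemon names and options are product-specific (check the Synergy/DE documentation for the real ones):

# chkconfig: 2345 98 20
# description: Synergy/DE xfServer
# processname: <daemon-name>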


Linux ls Color Coding

By Steve Ives, Posted on July 20, 2010 at 4:32 pm


It’s always driven me CRAZY the way that RedHat, Fedora, and presumably other Linux systems apply color coding to various types of files and directories in the output of the ls command. It wouldn’t be so bad, but it seems like the default colors for various file types and protection modes are just totally unreadable … black on dark green, for example, doesn’t show up that well!

Well, today I finally got around to figuring out how to fix it, my preference being to just turn the feature off. Turns out it was pretty easy to do: open a terminal, su to root, and edit /etc/DIR_COLORS. Towards the top of the file there is a directive that was set to COLOR tty, and to disable the colorization all I had to do was change it to COLOR none. Problem solved!
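A minimal sketch of the edit; the exact contents of /etc/DIR_COLORS vary between distributions, so treat this as illustrative:

# /etc/DIR_COLORS (excerpt)
# COLOR tty        <- original setting: colorize output on terminals
COLOR none         # disable ls colorization entirely

If you’d rather not change the system-wide default, RedHat-style systems typically also honor a per-user ~/.dir_colors file with the same syntax.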

Of course if you look further down in the file you’ll see that there are all kinds of settings for the color palettes to be used for various file types, file protection modes, etc. You could spend time “refining” the colors that are used … but personally I’m happier with the feature just GONE!


Learning from Stupidity

Posted on July 9, 2010 at 5:56 pm


I'll be the first to admit that I've done some really stupid things in my life.

Like the time I decided to paddle a canoe across a mile-wide river even as threatening clouds loomed on the horizon.  Or the time I got stuck on a hike by taking a "short-cut" which involved shimmying around an overhang, leaving me suspended over a 70-foot drop to some very sharp, very hard rocks below.

Then there was the time I remoted into a production Alpha server and decided to shut down TCP/IP for a few seconds. Now that was fun; I figured my first month on the job was also going to be my last.

But all of these dumb moves — and many others — have at least one thing in common: Though I learned something, everything was over so quickly that I never had much time to worry about the repercussions.

Not so, yesterday.

My laptop had been causing me more and more grief lately, so I decided it was time to re-install the OS and start from scratch. I wasn't in a huge rush, however, and I had other things to do anyways. So it took me several days to complete my preparations for the wipe, during which time I methodically moved files from my laptop to a backup device.

Yesterday, before lunch, I declared my system ready for reinstall, and pulled the proverbial trigger. I reformatted the hard drive, installed a clean copy of Windows 7, and then ran Windows Update to get everything up to snuff. Success! I was really rocking, and I realized that if I hurried, I could get a full backup of my "clean" build before I left for lunch. So of course, I did something incredibly, unbelievably stupid.

Lesson #1: Do NOT destroy a completely valid, disk image backup to make room for a "fresh" disk image backup.

Turns out that my backup device — a 300GB external drive — was getting a little full. I'd been faithfully (more or less) doing disk image backups for quite a while, with the most recent being dated just last Friday. But those files were just SO BIG and I really needed the space for a new backup set.

My rationalization was pretty solid: I'd backed up copies of only those files that I needed, they were all organized well, and I had ISO images of all the programs I was going to need to re-install, so what's the point in keeping a backup of a system I'm never going to use again anyways?

Plus, I really needed the space.

So I deleted the disk image backup, started a new one from scratch, and went to lunch. Upon returning, the backup was complete. Moving right along, I quickly copied my well-organized backup files into place, and started with software installations.

Someone upstairs was watching out for me, however, because the first software I re-installed was a tiny little program that allowed me to access very special, very important and very irreplaceable encrypted files. And though it installed without a hitch, I quickly found that the encrypted files it opens…

…weren't there.

They weren't in the folders I'd copied back to my laptop, and they weren't on the backup drive. I searched network drives, network computers, and even checked a USB flash drive just against the chance that I'd momentarily lost my mind, transferred them there, and then forgotten about it. Perhaps the worst problem was that I had specifically made sure that those files had been backed up two or three days ago, and I knew everything was ok.

Hadn't I?

I finally gave up on locating them the "easy" way, and started downloading software that scanned hard disks to recover deleted files. After trying five different freebie versions, each of which was a dismal failure, I'd almost given up hope. So just before midnight, I gave in and downloaded a try-before-you-buy piece of software called File Scavenger.

The demo version offers the ability to scan a hard drive and locate darn near everything that was ever on it and not overwritten, but only lets you recover 64K of a file before it asks you to pay. Knowing I'd happily pay the $49 if it worked, I downloaded and installed it. Upon running it, however, it looked as if it was going to take at least a couple of hours to scan whatever was left of my hard drive after the format/reinstall, so I decided to retire for the night and get some sleep.

Lesson #2: You can't sleep when you've probably just lost something that's irreplaceable.  (It's the Not Knowing and the What-If's that will keep snapping you back to full consciousness…again and again and again.)

Early this morning, I was back at my desk, with the knowledge that if the scan that I had left running was going to find something, it would have done so by now. I watched the ribbons bounce around on my laptop's monitor. I probably stared at them for a full minute before steeling myself for the worst, taking a last sip of coffee, and moving the mouse to break the spell of the screen saver.

There they were. I almost couldn't believe it. All three of the large, encrypted files that contained countless (or at least, well in excess of 150,000) other files.

Lesson #3: When pulled from a wallet too quickly, a credit card can cut through the pocket that holds it. Sometimes, it's safer for your wallet — and easier on you — to try the Pay-For "premium" solution before you waste hours hunting down the free alternative.

It was the fastest online purchase I've ever made. And within 30 minutes, I'd recovered all three files and confirmed that they were intact and had all of their information. I'd also backed them up (again) and re-confirmed their existence on the backup drive. I then put them on the hard drive of two other computers. Gun-shy? Absolutely.

But I've got to say, this software is amazing — and not just a little scary, too. While doing my scans of my laptop's hard drive, I found a lot of stuff that shouldn't be there. Like stuff that's been deleted for years. Doing a scan of my backup drive, a networked personal drive we use to keep copies of pictures, music, movies and (eek!) bank account information, and my little USB flash drive, I found lots and lots and lots of stuff that simply shouldn't be there.

Lesson #4 (Unintended): Deleted isn't deleted until you make it so.

Turns out that NTFS is really, really good at keeping your files intact even after they've been deleted — or even subjected to a quick re-format. FAT32 is as fragile as crystal by comparison, but still has the potential to leave a file intact long after you've deleted it. And while most everyone who's reading this already knows to use a disk utility to overwrite "unused" disk space before getting rid of a drive, remember that until you do, the data is likely still there.

And by the by … did you know that most printers and copiers have hard drives in them? Think twice before you donate or sell them, because the person who takes them off your hands may have File Scavenger (or something similar) in their possession! What I've learned — and now purchased — brings a whole new world of (shady) opportunities to the table. For instance, my neighbor down the street actually has a bunch of printers sitting in his yard under a big sign that says "Take Me I'm Free" (no kidding). It's suddenly tempting to pick them up and take a little peek inside, but fortunately (for both of us) I don't have the time right now, as I'm heading out the door on holiday in only a few short hours.

Now, if only I could just learn…

Lesson #5: Don't post a blog entry about your stupidity where the director of your department is likely to read about it.

…I could be reasonably sure that my job is still secure when I return from a well-needed vacation.

And yes: I'm going to Disneyland.

