
Synergex Blog


Airline Baggage Roulette

By Steve Ives, Posted on August 8, 2010 at 7:07 am


Well, it's another day in United Airlines' "friendly skies" for me. I'm setting out on a week-long trek around the East Coast and Midwest to visit several Synergy/DE customers with one of our sales account managers, Nigel David. It takes a LOT to make me smile when leaving home at 4 a.m. on a Sunday morning, but today that's exactly what happened.

I walked into the airport terminal at Sacramento "International" Airport, and I couldn't help noticing that one of the baggage carousels had been somewhat "jazzed up" (see photo). I just had to smile … I don't often check bags, but it seems like every time I do there is some kind of issue. Now, at least, it seems like this airport has decided to be honest about the state of things … checking a bag with the airlines is somewhat like a game of roulette! OK, the odds of getting a checked bag back on time may be a little better than one would expect in a casino … but not much!


Giving people what they want

By William Hawkins, Posted on August 6, 2010 at 12:53 pm


I just spent the past few days in Seattle attending Visual Studio Live! 2010. If you’re a regular follower of the Synergex PSG blog, you may have the impression that we swan around the world doing nothing but attending conferences. While I'd like that to be true, the reality is that in an effort to provide our customers with consultants who are more knowledgeable in non-Synergy technologies, Synergex has expanded both the number of people we send to conferences and the number of conferences we attend. For example, this week I was accompanied by Jerry Fawcett and Steven Lane—both from our support team. And, in my defense, this is only the third conference that I’ve been to (as an attendee) in a decade and a half.

Anyway, I digress. The VS Live! conference was hosted on the Microsoft campus in Redmond, but the speakers were mostly non-Microsoft peeps. There was a keynote presentation from Microsoft on two of the days, both of which were delivered in a very professional manner, but (with a few notable exceptions) I was disappointed by the rest of the presentations. The presenters obviously knew the topic matter, but they had not coordinated session content very well, made sure that the infrastructure was set up (doing a demo on Cloud/Azure technology requires a properly configured internet connection), or even made sure that the presentation slides were in the order in which they wanted to present them. Even little things like closing unnecessary toolbars/windows in Visual Studio, or using an appropriate font to show code, impact the perceived quality of a presentation.

For years, Bill Mooney has been telling PSG that he considers the SPC presentations among the best he's seen. I always just put that down to his enthusiasm, but after this week I can see what he’s getting at. I know I’m far from the world’s best presenter, but I’m starting to appreciate the level of professionalism we’ve been bringing to Synergy customers at our conferences over the years.

Last week, the PSG team met to go over the SPC 2010 conference content, and we all gave an overview of our planned sessions. Most of us were still in the planning stage of content development, but I had mostly completed my allotted sessions, so I thought I was pretty much done. But, as they say, no good deed goes unpunished, and I managed to get myself allocated a new 3-hour tutorial session to prepare.

The SPC this year promises to be another great conference with lots of hands-on content (and a reappearance of the Code Phantom), and I hope you will come see us in Sacramento.


Gearing up for another great SPC

By William Mooney, Posted on August 4, 2010 at 1:47 pm


It’s been a while since I posted a blog, but the SPC always seems to propel me back into the blogosphere! SPC 2010 is unbelievably just around the corner, and we are once again gearing up for a conference that is not-to-be-missed.

A few years ago, a customer asked me to help him justify the conference to his team and upper management. I quickly jotted down the top reasons to attend the SPC and ended up sending the list to all of our customers. As the reasons haven’t changed much since then, I won’t bore you by repeating them all, but I think the main ones deserve a recap.

Here goes…

  • Continuing education. Imagine going to your heart doctor for a checkup and learning that he or she has not been to an industry-related conference for several years. How has he/she kept up with all of the advances in the technology? Reading journals and surfing the net? Wouldn’t you prefer he/she had a more well-rounded education including hands-on instruction, networking with peers, and one-on-ones with industry experts? Likewise with your software—make sure your “application doctors” are getting the best education possible.
  • Break away from your day-to-day routine. One of our customers said about the SPC, “The SPC gives me a chance to escape the hustle and bustle of business and think strategically—I use it like a software development retreat.”  I have to agree. More often than not I solve problems or come up with new ideas when I’m away from the office on a business trip or seminar. And, your employees who attend will come back motivated and inspired by your confidence in sending them. I know this first-hand from the responses I get from my employees when I send them to conferences.
  • Learn about the future of Synergy/DE. Version 9.5 will offer native support for Microsoft’s .NET Framework, enabling you to interoperate with applications written in other .NET languages, such as C# or VB .NET; to take advantage of all .NET Framework classes; and to develop Synergy code in Visual Studio. At the SPC, you will learn the ins and outs of the new technology, and get the opportunity to try it out for yourself with hands-on exercises. (Note: The focus isn’t on getting to .NET. The focus is on modernizing your application – and .NET just happens to be the best way to do it. I’ll be blogging more about this shortly!)
  • Experience the latest functionality hands-on. It’s one thing to hear about all the new features we’ve added to our products over the years—it’s another to actually try them out with the knowledgeable PSG consultants standing by for questions. The popular Code Phantom is back, offering even bigger, better, and more enjoyable challenges to help you experience the latest functionality first-hand.
  • You want to make sure you are fully taking advantage of Synergy/DE 9.3. Synergy/DE 9.3 delivers a number of important features that enable you to improve security, performance, and productivity. The SPC will cover these features in detail so you can be sure you are making the most of them in your own applications.
  • Networking. SPC attendees often describe the networking opportunities as the most valuable aspect of the conference. Where else will you be among so many other Synergy/DE developers, who may be working on or may have completed projects just like those you are considering or may be struggling with? One of our customers who traveled from Europe told me he justified the entire conference just by one conversation he had with another customer at the welcome reception. The rest of the conference was just icing on the cake. And because this year’s conference is in Sacramento, the Synergex staff members who are developing and supporting your Synergy/DE products will be there to answer your questions.
  • Your business depends on it. I could go through several analogies ranging from maintaining your health, your home, your investment portfolios, etc., but the bottom line is that your business depends on your Synergy/DE solutions. With that in mind, how could you not take advantage of this once-a-year opportunity to make sure you are getting everything available to you—that you are working most efficiently, and that your products are as functional and powerful as the technology allows them to be?

You can find details about the conference at conference.synergex.com. I look forward to seeing you there.


Blunt Pencil?

By Richard Morris, Posted on August 2, 2010 at 10:22 am


I guess all jobs have their perks. This week, visiting the office in California in July is certainly one of mine. OK, so being stuck in an air-conditioned office all week is not exactly lapping up the sunshine, but today is Saturday and I’ve been loaned a bike! Now, when I say bike, let me explain. It’s kind of like a laid-back Harley Davidson Fat Boy with thick tires, wide handlebars and a thick padded seat. That’s where the resemblance with a Harley ends, I’m afraid. There is no chrome-laden chassis, or thumping V-twin engine, just two pedals, powered by my spindly and pale English legs! It’s certainly no wheelie machine either. Actually it’s quite a cool ride, just totally out of place against all the racing-style cycles I found myself among while cycling along the great American River towards Folsom Dam. But I mind not. Those spandex- and lycra-clad enthusiasts take it far too seriously! I was simply out for a great ride through some stunning scenery.

So, MP3 player blasting in my ears, I boldly set off on my adventure. I made sure it was only the guys on the ultrafast slick racing bikes that overtook me, but the trouble was, everyone seemed to be on a slick racing bike, honest! I still can’t explain how the walkers got past me though. I’m sure they were taking short cuts!

Undaunted, I pedalled on. Some of the downhill sections of the cycle track may well have been enjoyed at slightly more than the 15 mph speed limit – no, not me, officer! Some of the rather tight corners were taken pedal down (a similar concept to knee down, just at a more sedate pace). Big smiles all round.

And then the inspiration came to me! My MP3 player has a very varied collection of tunes from many different decades and genres. I regularly add new CDs to see if I like what I hear. One new song had a lyric from which my inspiration was born: “Like a pencil, old and blunt, I’m writing in a bolder font”. I thought, "Yeah, how true!" There is nothing worse than trying to be artistic with an old blunt pencil that smudges as you scribe. So, is this what I’m trying to do with Toolkit? Recent posts on Synergy-L continue to highlight the challenges we all have with fonts and colours, trying to make our applications look cool, modern and saleable.

So maybe it’s time to sharpen our pencils, utilise the new capabilities in Synergy 9, and begin to incorporate slick new UIs into our existing applications. This is my focus for SPC 2010 in October. I’ll present the tools and techniques we can all use to implement new, modern interfaces within our existing and proven Synergy applications.

My adventure is over now and I’m back in the office. Now, where is my pencil sharpener?  It’s time to give ChronoTrack a new lick of paint.

Not convinced? Then why not rise to this challenge…  Send me all the code and data required to build and run a sample element of your application and, in return, as long as you’re at the conference, I’ll demonstrate your same code running with a modern UI. And did I mention that it doesn’t have to be a toolkit application?

Glossary of terms. Wheelie: The act of raising the front wheel of your bike, in a controlled manner, while propelling your bike forward at high speed. In my case: always unintentional and usually with painful and expensive results! Knee down: The art of skimming your knee close to the tarmac while engaging a perfectly executed cornering manoeuvre. In my case: falling off!


XP: The O’Hare of Computer Network Traffic

By synergexadmin, Posted on July 30, 2010 at 4:57 pm


Not long ago, I found myself with a lot of time to kill while sitting in an airport and waiting out a flight delay. The culprit, I was told, was a weather system circling over Chicago. At first this seemed odd, since the plane I was awaiting had neither originated in nor connected through O’Hare.

Chicago’s O’Hare Airport is one of the worst places you can find yourself when trying to make a connection. Whenever I see O’Hare on a flight itinerary, I immediately offer up the prayer: “ORD, have mercy.”

I’d intentionally avoided Chicago in my travel arrangements, so I was a little perturbed that O’Hare was still causing me a headache. I began musing on the ripple effect it has on the nation’s air traffic network, and a sudden thought occurred to me: “O’Hare does to the air travel network what XP does to data networks.”

I knew I was being unfair to Chicago, but hey: I was irritated. In reality, O’Hare is a well-oiled machine when compared to what Windows XP can do to a network. But thinking of the nation’s air traffic as a computer network really got my Analogy Engine started. I could see every plane as a network packet carrying bits and bytes of information. I saw traffic control towers as network controllers, and airports as individual computers (or hubs, as the case may be). I saw it all in a whole new light…and then wondered, “What would happen if an airport really handled network traffic in a manner similar to XP?”  It was a chilling thought.

For all of its apparent problems, O’Hare still has a significant networking advantage over the default operations of XP (and Windows 2000 and Server 2003): Selective Acknowledgements. The concept at the heart of SACKS allows the controllers at O’Hare to bring planes in for landings without regard for the order in which they were supposed to arrive.

If you’ve ever found yourself trying to diagnose why an XP user is complaining that “the Internet is slow” – even while everyone else on the network seems to be enjoying good speeds – then you’ve already seen what can happen when SACKS aren’t enabled for TCP communications. In fact, in this day and age, SACKS are so vital that it’s amazing Microsoft has never put out a fix on Windows Update – even as an optional one – that enables Selective Acknowledgements. Instead, they leave it up to systems administrators to manually edit the registry of each user’s machine – if, that is, they’re even aware of the problem or its solution.

I should warn you now that because I’m not interested in being the unwitting cause of a registry foul-up that destroys someone’s computer, I’m not going to tell you what those registry tweaks are. There are plenty of articles that will walk you through the process of enabling Selective Acknowledgement on Windows XP, as well as tips on setting the TCPWindowSize to further enhance performance. If you’re just reading this in hope of finding a quick fix to your XP networking woes, you might want to move along now.

On the other hand, if you’d like to learn a little more about SACKS and why it’s so important, then stick around and read on…

Understanding the importance of Selective Acknowledgements is perhaps easier if you understand what happens when they’re not enabled. Imagine that you’re Xavier Purdydumm (XP to your friends), and you’re the “receiving” traffic controller at Nitwit Sacksless Airport (airport code: NTSX)…

You arrive at work before a single plane has entered your airspace. You then review a list of scheduled arrivals for the day. You note the number of planes you expect today, and the order in which they will land. It looks like a light traffic day – you’re only expecting 25 planes before your shift ends.

You’re now ready, so you send out a notification that traffic operations can now commence. And look here! The first plane is on approach. So far, so good.

Plane One lands without incident, as do planes Two and Three. But uh-oh, now there’s a problem: Plane Five just landed…but where’s Plane Four?

You immediately send out a plane of your own. Its instructions are to fly to Plane Four’s origination point, and to let its traffic controller know that Plane Four never arrived. In the meantime, Plane Six comes in, lands, and heads to the gate.

Still no Plane Four. However, regulations are regulations, so now you send out yet another plane to again request that Plane Four be sent over. And aha! A new plane arrives. It’s….

…Plane Seven.

You repeat the process again and again, once for each time a new plane arrives. Finally, after you’ve sent off your 15th plane to request the location of the missing aircraft, plane Four suddenly lands. Turns out that plane Four got diverted when it ran into a storm system over Lake Cisco – the same storm system, as it turns out, that you just sent your last 15 planes flying into.

Well, that’s not your concern. You’re here to count planes coming in. And speaking of which, here comes another. It touches down, rolls past you and you see that it’s plane Five. You shake off a sense of déjà vu and cross it off your list.

You also cross plane Six off of your list – almost (but not quite) certain you’ve seen it before, too – when something odd happens: plane Four lands and taxis by.

Now how could that have happened? You’ve already crossed it off of your list, but there it is (again), plain as day. Deciding you must have made a mistake, you erase the check marks next to planes Five and Six, since there’s no way they could have landed if plane Four hadn’t landed yet.

And just to prove it: Here come planes Five, Six, and…Four? Again??!?

By now, you’re completely confused, and the fact that one of your underlings keeps reporting that there are already planes sitting at Gates 4 through 6 is really getting on your nerves. He should clearly see that those planes haven’t been checked off your list, so why is he bugging you? You tell him to take care of it, as you’re very busy counting planes at the moment, so he tells them all to get lost. Taking a bit of initiative, he also shoos away the planes waiting to park at gates 7 through 18, too.

This process repeats itself again and again – a total of five times – before the count continues normally with planes Seven, Eight, Nine, Ten, and so forth. By the time plane Twenty Five successfully touches down and unloads at the gate, you feel that somehow your easy traffic day became a nightmare.

And it was a nightmare – but not just for poor Xavier Purdydumm. It was a grueling day for the traffic controller at the airport from which all twenty-five planes departed, as well as for everyone else using the air traffic network – including the folks that never even passed through NTSX.

Let’s take a quick look at the network traffic as shown in the above scenario, versus how it might have looked if Xavier had been able to work with Selective Acknowledgements:

Protocol    Packets Received                                                Total
No SACKS    1-2-3-5-6-7-8-9-10-11-12-13-14-15-16-17-18-4-
            5-6-4-5-6-4-5-6-4-5-6-4-5-6-7-8-9-10-11-12-13-14-15-16-17-18-
            19-20-21-22-23-24-25                                            51
SACKS       1-2-3-5-6-7-8-9-10-11-12-13-14-15-16-17-18-4-
            19-20-4-21-22-4-23-24-4-25-4                                    29

You’ll notice that the first 18 packets are identical, but after that things start to go awry. With only 25 packets in the total communication, a disruption on the 4th packet caused the network to carry 75% more traffic than would have been necessary had SACKS been enabled. Why?

Under TCP, it’s the duty of the receiver to notify the sender when a packet arrives out of order. Unfortunately, SACKS-less communications require that all traffic arrive in order. And just to make matters worse, the originator of the traffic has its own set of rules to follow – the default being to resend a packet only after it has been requested three times.

Now, in the example above, one might say that it was the storm over Lake Cisco that caused the issue, but it’s hard to pin the blame on the latency the weather caused. Sure, it caused the disappearance of the original “plane Four.” It also slowed down the traffic both to and from Nitwit Airport, thus allowing fifteen requests to be sent from NTSX before the originator ever received the first three and re-sent plane Four.

But note that the storm caused an equal number of duplicates to be dispatched, whether the protocol had SACKS enabled or not, so as a “factor” the weather pretty much washes out; everyone has to deal with it.

So while the weather causes things to slow down a bit (latency), the root problem is that SACKS-less communications require the sender to resend packets that have already been received, in addition to the lost packet. It’s bad enough in the tiny scenario shown above…but consider the impact if there had been 1,000 packets sent with multiple packet losses.
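For what it's worth, the "75% more traffic" figure falls straight out of the packet counts in the table: 51 packets without SACKS versus 29 with. A quick shell arithmetic check:

```shell
# Sanity-check the "75% more traffic" claim from the table:
# 51 packets delivered without SACKS vs. 29 with SACKS enabled.
extra=$(( (51 - 29) * 100 / 29 ))
echo "${extra}% more traffic without SACKS"   # prints: 75% more traffic without SACKS
```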

As I mentioned before, there’s a fix that allows you to turn on Selective Acknowledgements, but it’s not easy to implement – particularly if you’re a developer with multiple software installations at customer sites. The only way around the problem (and remember that it affects Windows 2000 and Server 2003 as well) is to modify the registry. You may find resistance from your customers when you tell them that they’re going to need to launch RegEdit and start adding DWORD values on every XP workstation they own.

For those of you who are wondering why SACKS-less networking is even in use on XP, remember that “Selective Acknowledgement” is a networking paradigm that came about long after the NT operating system had been created. Back then, when NT launched, there was no such thing as a “high-speed” internet. Networking technology was designed primarily to deal with LANs, which generally meant low-latency communications and fewer lost packets.

Years later, Windows 2000, Windows XP and Server 2003 were introduced. Everyone would probably agree that they were huge steps forward, but unfortunately they all borrowed heavily from NT networking services. That meant that they also adopted a SACKS-less networking default – even as internet speeds, overall network traffic and latency potentials were skyrocketing.

So the next time the skies are full over Chicago, the clouds are massing above Lake Michigan and you figure you’re either going to be late for dinner or late for your connection, remember Xavier Purdydumm…and thank the ORD for Selective Acknowledgements and the fact that O’Hare, at least, has made an effort to keep up with the times.


Starting Services on Linux

By Steve Ives, Posted on July 24, 2010 at 3:14 pm


For a while now I’ve been wondering about the correct way to start boot-time services such as the Synergy License Manager, xfServer and xfServerPlus on Linux systems. A few years ago I managed to “cobble something together” that seemed to work OK, but I had a suspicion that I only had part of the solution. For example, while I could make my services start at boot time, I wasn’t sure that they were getting stopped in a “graceful” way during system shutdown. I also wondered why my “services” didn’t show up in the graphical service management tools.

My cobbled together solution involved placing some appropriately coded scripts into the /etc/rc.d/init.d folder, then creating symbolic links to those files in the appropriate folders for the run levels that I wanted my services started in, for example /etc/rc.d/rc5.d.

This week, while working on a project on a Linux system, I decided to do some additional research and see if I couldn’t fill in the blanks and get things working properly.

My previous solution, as I mentioned, involved placing an appropriately coded script in the /etc/rc.d/init.d folder. Turns out that part of my previous solution was correct. For the purposes of demonstration, I’ll use the Synergy License Manager as an example; my script to start, stop, restart and determine the status of License Manager looked like this:

#
# synlm - Start and stop Synergy License Manager
#

. /home/synergy/931b/setsde

case "$1" in
    start)
        echo -n "Starting Synergy License Manager"
        synd
        ;;
    stop)
        echo -n "Stopping Synergy License Manager"
        synd -q
        ;;
    restart)
        $0 stop
        $0 start
        ;;
    status)
        if ps ax | grep -v grep | grep -v rsynd | grep synd > /dev/null
        then
            echo "License Manager is running (pid is `pidof synd`)"
        else
            echo "License Manager is NOT running"
        fi
        ;;
    *)
        echo $"Usage: synlm {start|stop|restart|status}"
        exit 1
esac

exit 0

If you have ever done any work with UNIX shell scripts then this code should be pretty self-explanatory. The script accepts a single parameter of start, stop, restart or status, and takes appropriate action. The script conforms to the requirements of the old UNIX System V init subsystem, and if placed in an appropriate location will be called by init as the system runlevel changes. As mentioned earlier, I had found that if I wanted the “service” to start, for example when the system went to runlevel 5, I could create a symbolic link to the script in the /etc/rc.d/rc5.d folder, like this:

ln -s /etc/rc.d/init.d/synlm /etc/rc.d/rc5.d/S98synlm

Init seems to process files in a run level folder alphabetically, and the existing scripts in the folder all seemed to start with S followed by a two digit number. So I chose the S98 prefix to ensure that License Manager would be started late in the system boot sequence.
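That ordering is easy to confirm for yourself: init's "alphabetical" processing is just plain lexical sorting, which is why an S98 prefix runs late. A quick illustration (the script names here are made-up examples, not real services on your system):

```shell
# Lexical ordering of typical runlevel script names -- init processes
# them in this order, so S98synlm runs after the lower-numbered scripts.
printf 'S98synlm\nS10network\nS55sshd\n' | sort
# prints:
#   S10network
#   S55sshd
#   S98synlm
```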

This approach seemed to work pretty well, but it was kind of a pain having to create all those symbolic links … after all, on most UNIX and Linux systems, run levels 2, 3, 4 and 5 are all multi-user states, and probably all require License Manager to be started.

Then, almost by accident, I stumbled across a command called chkconfig. Apparently this command is used to register services (or more accurately init scripts) to be executed at various run levels. PERFECT … I thought! I tried it out:

# chkconfig --level 2345 synlm on
service synlm does not support chkconfig

Oh! … back to Google… Turns out I was missing something really critical in my script, and believe it or not, what I was missing was a bunch of comments! After doing a little more research I added these lines towards the top of the script:

# chkconfig: 2345 98 20
# description: Synergy/DE License Manager
# processname: synd

Lo and behold, this was the missing piece of the puzzle! Comments … you gotta love UNIX! So now all I have to do to start License Manager at boot time, and stop it at system shutdown, is use the chkconfig command to “register” the service.
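Putting the pieces together, the whole registration amounts to just a few commands. This is a sketch, run as root, and it assumes the script shown earlier (with the chkconfig comment block added) was saved as /etc/rc.d/init.d/synlm:

```shell
# Register the init script as a proper service.
chmod +x /etc/rc.d/init.d/synlm
chkconfig --add synlm             # register the service with init
chkconfig --level 2345 synlm on   # start it at runlevels 2, 3, 4 and 5
chkconfig --list synlm            # verify the per-runlevel on/off settings
```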

And there’s more … With License Manager registered as a proper service, you can also use the service command to manipulate it. For example, to manually stop the service you can use the command:

# service synlm stop

And of course you can also use similar commands to start, restart, or find the status of the service. Basically, whatever operations are supported by the init script that you provide.

Oh, by the way, because License Manager is now running as a proper service it also shows up in the graphical management tools, and can be manipulated by those tools … very cool!

Of course License Manager is just one of several Synergy services that you could use this same technique with. There’s also xfServer, xfServerPlus and the SQL OpenNet server.


Visual Studio 2008 SP1 Hangs After Office Upgrade

By Steve Ives, Posted on July 22, 2010 at 5:55 pm


Just in case you run into the same issue…

This week I had to revert to using Visual Studio 2008 while working on a customer project, and I pretty quickly found that I had a problem. I was working on an ASP.NET web project, and found that each time I opened a web page for editing, Visual Studio would appear to hang. Clicking anywhere on the Visual Studio window resulted in the ubiquitous Windows “beep” sound.

On researching the problem in the “Universal Documentation System” (Google) I quickly found that I was not alone in my frustrations … in fact it seems like this is a common issue right now.

Turns out that the problem is related to the fact that I recently upgraded from Office 2007 to Office 2010. Visual Studio 2008 apparently uses some components from Office 2007 when editing HTML and ASPX pages, and it seems that component got screwed up by the Office 2010 upgrade. If you encounter this problem you will likely find that when Visual Studio 2008 hangs it has started a SETUP.EXE process, but that process never seems to complete. Apparently it’s attempting to do a repair of the “Microsoft Visual Studio Web Authoring Component”, but for some reason can’t.

The solution seems to be to manually run the setup program and select “Repair”. On my system the setup program was C:\Program Files (x86)\Common Files\microsoft shared\OFFICE12\Office Setup Controller\Setup.exe. My system runs a 64-bit O/S … if you’re using a 32-bit O/S you’ll presumably just need to drop the (x86) part.

The repair took about two or three minutes, and lo and behold, I have my Visual Studio 2008 installation working just fine again!


Linux ls Color Coding

By Steve Ives, Posted on July 20, 2010 at 4:32 pm


It’s always driven me CRAZY the way that RedHat, Fedora, and presumably other Linux systems apply color coding to various types of files and directories in the output of the ls command. It wouldn’t be so bad, but it seems like the default colors for various file types and protection modes are just totally unreadable … for example, black on dark green doesn’t show up that well!

Well, today I finally got around to figuring out how to fix it … my preference being to just turn the feature off. Turns out it was pretty easy to do: open a terminal, su to root, and edit /etc/DIR_COLORS. Towards the top of the file there is a line that was set to COLOR tty, and to disable the colorization all I had to do was change it to COLOR none. Problem solved!
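If you'd rather not edit the file by hand, the same change can be scripted. This is a sketch: /etc/DIR_COLORS is the RedHat/Fedora location, it needs to be run as root, and it keeps a backup copy in case you change your mind:

```shell
# Disable ls colorization by switching "COLOR tty" to "COLOR none"
# in /etc/DIR_COLORS (RedHat/Fedora path; run as root).
cp /etc/DIR_COLORS /etc/DIR_COLORS.bak
sed -i 's/^COLOR[[:space:]]\{1,\}tty/COLOR none/' /etc/DIR_COLORS
grep '^COLOR' /etc/DIR_COLORS   # should now show: COLOR none
```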

Of course if you look further down in the file you’ll see that there are all kinds of settings for the color palettes to be used for various file types, file protection modes, etc. You could spend time “refining” the colors that are used … but personally I’m happier with the feature just GONE!


Winner Winner Chicken Dinner!

By Don Fillion, Posted on July 12, 2010 at 12:45 pm


Spain isn’t the only winner…
Congratulations to Graeme Harris of G. Harris Software—the lucky winner of the Official World Cup Jabulani ball!
Graeme, your name was picked in the random drawing of all PSG blog subscribers who entered to win. Your official World Cup soccer ball is already en route to your doorstep. (Let's hope it flies true and doesn't sail over the goal…)
Thanks to everyone who subscribed and participated. We hope that you continue to read and enjoy the blog!


Learning from Stupidity

By synergexadmin, Posted on July 9, 2010 at 5:56 pm


I'll be the first to admit that I've done some really stupid things in my life.

Like the time I decided to paddle a canoe across a mile-wide river even as threatening clouds loomed on the horizon.  Or the time I got stuck on a hike by taking a "short-cut" which involved shimmying around an overhang, leaving me suspended over a 70-foot drop to some very sharp, very hard rocks below.

Then there was the time I remoted into a production Alpha server and decided to shut down TCP/IP for a few seconds. Now that was fun; I figured my first month on the job was also going to be my last.

But all of these dumb moves — and many others — have at least one thing in common: Though I learned something, everything was over so quickly that I never had much time to worry about the repercussions.

Not so, yesterday.

My laptop had been causing me more and more grief lately, so I decided it was time to re-install the OS and start from scratch. I wasn't in a huge rush, however, and I had other things to do anyways. So it took me several days to complete my preparations for the wipe, during which time I methodically moved files from my laptop to a backup device.

Yesterday, before lunch, I declared my system ready for reinstall, and pulled the proverbial trigger. I reformatted the hard drive, installed a clean copy of Windows 7, and then ran Windows Update to get everything up to snuff. Success! I was really rocking, and I realized that if I hurried, I could get a full backup of my "clean" build before I left for lunch. So of course, I did something incredibly, unbelievably stupid.

Lesson #1: Do NOT destroy a completely valid, disk image backup to make room for a "fresh" disk image backup.

Turns out that my backup device — a 300GB external drive — was getting a little full. I'd been faithfully (more or less) doing disk image backups for quite a while, with the most recent being dated just last Friday. But those files were just SO BIG and I really needed the space for a new backup set.

My rationalization was pretty solid: I'd backed up copies of only those files that I needed, they were all organized well, and I had ISO images of all the programs I was going to need to re-install, so what's the point in keeping a backup of a system I'm never going to use again anyways?

Plus, I really needed the space.

So I deleted the disk image backup, started a new one from scratch, and went to lunch. When I returned, the backup was complete. Moving right along, I quickly copied my well-organized backup files into place, and started with software installations.

Someone upstairs was watching out for me, however, because the first software I re-installed was a tiny little program that allowed me to access very special, very important and very irreplaceable encrypted files. And though it installed without a hitch, I quickly found that the encrypted files it opens…

…weren't there.

They weren't in the folders I'd copied back to my laptop, and they weren't on the backup drive. I searched network drives, network computers, and even checked a USB flash drive just against the chance that I'd momentarily lost my mind, transferred them there, and then forgotten about it. Perhaps the worst problem was that I had specifically made sure that those files had been backed up two or three days ago, and I knew everything was ok.

Hadn't I?

I finally gave up on locating them the "easy" way, and started downloading software that scanned hard disks to recover deleted files. After trying five different freebie versions, each of which was a dismal failure, I'd almost given up hope. So just before midnight, I gave in and downloaded a try-before-you-buy piece of software called File Scavenger.

The demo version offers the ability to scan a hard drive and locate darn near everything that was ever on it and not overwritten, but only lets you recover 64K of a file before it asks you to pay. Knowing I'd happily pay the $49 if it worked, I downloaded and installed it. Upon running it, however, it looked as if it was going to take at least a couple of hours to scan whatever was left of my hard drive after the format/reinstall, so I decided to retire for the night and get some sleep.

Lesson #2: You can't sleep when you've probably just lost something that's irreplaceable.  (It's the Not Knowing and the What-If's that will keep snapping you back to full consciousness…again and again and again.)

Early this morning, I was back at my desk, with the knowledge that if the scan that I had left running was going to find something, it would have done so by now. I watched the ribbons bounce around on my laptop's monitor. I probably stared at them for a full minute before steeling myself for the worst, taking a last sip of coffee, and moving the mouse to break the spell of the screen saver.

There they were. I almost couldn't believe it. All three of the large, encrypted files that contained countless (or at least, well in excess of 150,000) other files.

Lesson #3: When pulled from a wallet too quickly, a credit card can cut through the pocket that holds it. Sometimes, it's safer for your wallet — and easier on you — to try the Pay-For "premium" solution before you waste hours hunting down the free alternative.

It was the fastest online purchase I've ever made. And within 30 minutes, I'd recovered all three files and confirmed that they were intact and had all of their information. I'd also backed them up (again) and re-confirmed their existence on the backup drive. I then put them on the hard drive of two other computers. Gun-shy? Absolutely.

But I've got to say, this software is amazing — and not just a little scary, too. While doing my scans of my laptop's hard drive, I found a lot of stuff that shouldn't be there. Like stuff that's been deleted for years. Doing a scan of my backup drive, a networked personal drive we use to keep copies of pictures, music, movies and (eek!) bank account information, and my little USB flash drive, I found lots and lots and lots of stuff that simply shouldn't be there.

Lesson #4 (Unintended): Deleted isn't deleted until you make it so.

Turns out that NTFS is really, really good at keeping your files intact even after they've been deleted — or even subjected to a quick re-format. FAT32 is as fragile as crystal by comparison, but still has the potential to leave a file intact long after you've deleted it. And while most everyone who's reading this already knows to use a disk utility to overwrite "unused" disk space before getting rid of a drive, remember that until you do, the data is likely still there.
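The "overwrite before delete" principle can be sketched in a few lines of Python. This is purely illustrative (the `wipe_and_delete` helper is my own, not any real tool's API), and a dedicated secure-erase utility is far more thorough — on journaling filesystems and wear-leveled SSDs, old copies of the blocks may survive even after an overwrite:

```python
import os

def wipe_and_delete(path, passes=1):
    """Overwrite a file's contents in place, then delete it.

    Illustrative only: journaling filesystems and SSD wear leveling
    can keep stale copies of the data alive; real secure-erase tools
    make multiple patterned passes and handle metadata too.
    """
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)
            f.write(b"\x00" * size)   # replace every byte with zeros
            f.flush()
            os.fsync(f.fileno())      # push the overwrite to disk
    os.remove(path)                   # only now remove the directory entry
```

The point of the ordering is that the delete comes last: removing a file first just drops the directory entry and leaves the data blocks recoverable, which is exactly what File Scavenger exploits.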

And by the by…did you know that most printers and copiers have hard drives in them? Think twice before you donate or sell them, because the person that takes them off of your hands may have File Scavenger (or something similar) in their possession! With what I've learned — and now purchased — it brings a whole new world of (shady) opportunities to the table. For instance, my neighbor down the street actually has a bunch of printers sitting in his yard under a big sign that says "Take Me I'm Free" (no kidding). It's suddenly tempting to pick them up and take a little peek inside, but fortunately (for both of us) I don't have the time right now, as I'm heading out the door on holiday in only a few short hours.

Now, if only I could just learn

Lesson #5: Don't post a blog entry about your stupidity where the director of your department is likely to read about it

I could be reasonably sure that my job will still be secure when I return from a much-needed vacation…

And yes: I'm going to Disneyland.


“Applemania” and iPhone 4

By Steve Ives, Posted on July 1, 2010 at 1:22 pm

Steve Ives

So I finally did what I said I would never do … I set out from home, in the wee hours of the morning, to stand in line for hours in order to buy something!  The venue? … my local AT&T store. The event? … the first in store availability of iPhone 4. In my lifetime I have never done this before, but I figured … what the heck! I grabbed my iPad for a little entertainment while in line and headed out around 3am.

I stopped off at the 24-hour Starbucks drive-through on the way there and stocked up with a large black coffee and a sandwich, and by 3.15am I had staked my place in line. I was surprised that there were only around 20 people ahead of me in line; I was expecting more. Apparently the first guy in the line had been there since 7pm the previous evening … a full twelve hours before the store was due to open at 7am!

So how was the wait? Actually it was kind of fun. There were all kinds of people in line, from technology geeks like me, to teens with too much money on their hands, to families, and even a few retired seniors. Everyone was chatting away, and the time passed pretty quickly. It was a beautiful night out, not too cold, not too hot, and before we knew it the sun was rising at 5.30am. By this time the line had grown considerably longer, and by the time the store opened at 7 it had probably grown to two or three hundred people! I remember thinking to myself that if the same thing was being repeated at every AT&T store in the country then there were a LOT of people standing in line.

Opening hour arrived and within a few minutes I was walking out of the store with my new phone and heading off for a day in the office.

So … was the new iPhone worth the wait? … ABSOLUTELY! I’ve been using an iPhone 3G for quite a while now and I was already in love with the thing. I’d skipped the whole 3GS iteration of the device, so the differences between my old phone and my new one were … staggering!

The new Retina display, with a resolution of 960 x 640 (vs. the 480 x 320 of earlier models) means that there are four times the number of pixels packed into the same amount of screen real estate. This translates to a screen which looks fabulous, and photos and videos which look considerably better.

Speaking of photos and videos, the upgrade to a 5MP camera and the addition of an LED flash finally make it possible to take reasonably good pictures and videos with an iPhone. There is also a new camera on the front of the phone; it is a much lower resolution (only VGA in fact) but actually that's perfect if you want to take a quick photo, or record a short video and then email it out to someone (especially if you're on the new 200MB data plan … but more about that later).

The iPhone 4, like the iPad, uses Apple's new proprietary A4 (computer on a chip) silicon, and I must say, the performance gain does seem to be considerable, even compared to the more recent 3GS models. Another benefit of this is that, despite the fact that the new device is smaller than previous iPhones, there is more room inside for a bigger battery! This is great news, because battery endurance has never been one of the iPhone's strong points to date.

Of course one of the coolest new features is FaceTime … video calling between iPhone 4 users. I haven’t had a chance to try this out yet, but I’m looking forward to doing so soon. Apparently FaceTime only works over Wi-Fi networks, which is probably a good thing both from a performance point of view, and also potentially a cost point of view … which brings me to the subject of data plans.

In the past, in the US at least with AT&T, all iPhone users had to cough up $30/month for their data plan, and in return were able to use an unlimited amount of data. This was great because it meant that you could happily use your shiny iPhone to the full extent of its considerable capabilities and not have to worry about how much bandwidth you were actually consuming. But now … things have changed!

New iPhone customers now have a choice of two data plans: a $15/month plan which allows for 200MB of data transfer, or a $25 plan providing 2GB. AT&T claim that the 200MB plan will cover the data requirements of 65% of all iPhone users, which may or may not be true. Even if you opt for the more expensive 2GB plan you still have a cap, and may need to be careful. Personally I don’t think I’d be very happy on the 200MB plan, mainly because of things outside my control, like the size of incoming email attachments.

I have been trying to find out what happens when you reach your monthly limit, but so far without success. One AT&T employee told me that on reaching your data limit the account will simply be charged for another “block” of data, without any requirement for the user to “opt in”. Another AT&T employee told me essentially the opposite; that network access would be suspended until the user opts in to purchase more data (similar to the way the iPad works). What I do know is that as you draw close to your limit you should receive free text messages (three I believe, at various stages) warning you of the issue. All I can suggest right now is … watch out for those text messages!

For existing iPhone customers, the good news is that your existing unlimited plan will be “grandfathered in” at the same rate that you currently pay, so we can all continue to consume as much bandwidth as we like and not worry too much about it!

Apple seems to have done a pretty nice job with the implementation of the recently introduced iOS 4. The platform finally has multi-tasking capabilities, which some may not immediately appreciate the benefit of, but it just makes the whole user experience so much more streamlined.  Also the new folders feature makes it easy to organize your apps logically without having to flip through endless screens of icons. Pair the advances in the operating system with the significant advances in the hardware of the new device and the overall impact is really quite significant.

Overall, I think Apple did a good job with the iPhone 4, but there are a couple of things I don't like. The main one is … well, with its "squared off" edges … the new device just doesn't feel as good in your hand as the older models. Also, no doubt you'll have heard all the hype about lost signal strength if the device is held in a certain way … well, I must say that it seems like there could be something to that. Unfortunately, when using the device for anything other than making a call, I reckon that most people hold the phone in a way that causes the problem! Of course Apple has offered two solutions to the problem … 1) don't hold the device that way … and 2) purchase a case!

But on balance I think the upgrade was worth it. There are so many cool new things about iPhone 4 but I’m not going to go into more detail here … there are hundreds of other blogs going into minute detail about all the features, and if you want to find out more a good place to start is http://www.apple.com/iphone/features.


Windows Live SkyDrive

By Steve Ives, Posted on June 28, 2010 at 5:13 pm

Steve Ives

Have you ever wished there was an easy way to view and edit your documents on different computers, in different locations, in fact … from anywhere, and without having to carry USB thumb drives, or log in to a VPN. Well there is, and it’s free.

For some time now Microsoft have offered a free service called Office Live Workspace (http://workspace.officelive.com), which went part of the way to solving the problem. Office Live Workspace essentially provides 5GB of free on-line storage, and a web-based portal, which allows you to upload, manage and view your files. It’s primarily designed to deal with Microsoft Office files, although other files can be stored there also.

Office Live Workspace worked pretty well, but it did have some restrictions, which meant that the experience was somewhat less than optimal. For example, when viewing a document it would be converted to an HTML representation of the actual document and displayed in the browser. You did have the option to edit the document of course, but doing so required you to have a recent copy of Microsoft Office installed on the computer that you were using. This is probably fine if you are using your own system, but was likely a problem if you were using a public computer in a hotel or an airline lounge.

On the positive side, if you did happen to be working on a system with a recent copy of Microsoft Office, and had the Windows Live Workspace extensions installed, it was possible to interact with your on-line storage directly from within the Office applications, similar to the way that you work with files on a SharePoint server, and this worked really well.

So, using Office Live Workspace from within Microsoft Office was a good experience, and at least you could get to, view and download your files from any Internet browser.

There is also another interesting product called Windows Live Sync, which kind of approaches the problem from another angle. Sync allows you to synchronize the files in one or more shared folders with one or more other computers. If you add a file on one computer it is replicated, pretty much instantly, to the other computers that “subscribe” to the shared folder. This is a very different approach, because although your documents clearly flow over the network (securely of course), they don’t get stored on network servers. So this is a great solution if you want to be able to edit a document at home, and have it magically appear in a folder at work so you can work on it the next day. But there is no access to the files via a web browser on some other computer.

Enter Windows Live SkyDrive (http://windowslive.com/online/skydrive), which seems to combine the concepts of both Office Live Workspace and also Windows Live Sync … and then adds even more.

SkyDrive is a free service providing 25GB of on-line storage. Like Office Live Workspace it has a web-based UI, which allows files to be uploaded, viewed, downloaded, etc. It is also possible, of course, to edit your files directly using your local Microsoft Office applications. So far so good … so what’s different?

Well, perhaps the main difference is that as well as allowing documents to be viewed in your web browser, SkyDrive also integrates Microsoft’s new Office Web applications. So, not only can you edit your Word documents, Excel spreadsheets and PowerPoint presentations locally, you can also do so directly in the web browser! You can even create new Office documents directly on the server in the same way.

Of course the new Office Web applications are somewhat cut-down versions of their desktop counterparts; in fact they only have a fraction of the capabilities of the full products, but nevertheless they are very usable, and allow you to do most of the routine editing tasks that you are likely to need for day-to-day work on your documents. Remember, this is all for free – pretty cool!

But there’s more … SkyDrive also provides Sync capabilities. Not for the full 25GB of on-line storage, but there is a 2GB “bucket” that you can use to set up synchronization of documents between computers … the difference is that the documents are also available on the SkyDrive. So now you can edit your documents locally at home, or at work … on your own computers, but still have access to them via a web interface when away from your own systems. Unfortunately the Office Web apps can’t be used on these synchronized files (hopefully that will change at some point), but you do have access to them from any browser.

By default everything that you upload or Sync through any of these products can only be accessed via your own Windows Live login … but you can setup shares and give others access to all or part of your storage too. And there is specific support for creating shared on-line photo albums too.

Oh, I almost forgot, if like me you use a combination of Windows and Mac computers then all of these products work just great on Mac too. In fact, personally I think the Office Live Workspace experience is actually better on the Mac than the PC! I have just finished testing SkyDrive on the Mac too, including Sync, and it works wonderfully well.

SkyDrive is currently a beta service, but is in the process of transitioning to full production use about now. I’ve been playing with it for a little while now, and it seems to work extremely well. Check it out.


Search Engine Optimization (SEO)

By Steve Ives, Posted on June 14, 2010 at 7:08 pm

Steve Ives

I’ve been writing web applications for years, but I’ve never really had to put too much thought into whether search engines such as Google and Bing were finding the sites that I have worked on, or whether they were deducing appropriate information about those sites and giving them their appropriate ranking in search results. The reason for this is that most of the web development that I have been involved with has tended to be web “applications”, where the bulk of the interesting stuff is hidden away behind a login; without logging in there’s not much to look at, and in many cases the content of the site isn’t something you would want search engines to look at anyway … so who cares about SEO!

However, if you have a web site that promotes your company, products and services then you probably do care about SEO. Or if you don’t, you probably should! Improving your ranking with the search engines could have a positive impact on your overall business, and in these economic times we all need all the help we can get.

Turns out that the basic principles of SEO are pretty straightforward; really it’s mainly about making sure that your site conforms to certain standards, and doesn’t contain errors. Sounds simple, right? You’d be surprised how many companies pay little, if any, attention to this potentially important subject, and suffer the consequences of doing so … probably without even realizing it.

You have little or no control over when search engine robots visit your site, so all you can do is try to ensure that everything that you publish on the web is up to scratch. Here are a few things that you can do to help improve your search engine ratings:

  • Ensure that the HTML that makes up your site is properly formatted. Robots take a dim view of improperly formatted HTML, and the more errors that are found, the lower your ratings are likely to be.
  • Don’t assume that HTML editors will always produce properly formatted HTML, because it’s not always the case!
  • Try to limit the physical size of each page. Robots have limits regarding the physical amount of data that they will search on any given page. After reading a certain amount of data from a page a robot may simply give up, and if there is important information at the bottom of a large page, it may never get indexed. Unfortunately these limits may be different from robot to robot, and are not published.
  • Ensure that every page has a title specified with the <TITLE> tag, and that the title is short and descriptive. Page titles are very important to search engine robots.
  • Use HTML headings carefully. Robots typically place a lot of importance on HTML heading tags, because it is assumed that the headings will give a good overall description of what the page is about. It is recommended that a page only has a single <H1> tag, and doesn’t make frequent use of subheadings (<H2>, <H3> etc.).
  • Use meta tags in each page. In particular use the meta keywords and meta description tags to describe what the page content is about, but also consider adding other meta tags like meta author and meta copyright. Search engine robots place high importance on the data in meta tags.
  • Don’t get too deep! Search engines have (undocumented) rules about how many levels deep they will go when indexing a site. If you have important content that is buried several levels down in your site it may never get indexed.
  • Avoid having multiple URLs that point to the same content, especially if you have external links in to your site. How many external links point to your content is an important indicator of how relevant your site is considered to be by other sites, and having multiple URLs pointing to the same content could dilute the search engine crawler’s view of how relevant your content is to others.
  • Be careful how much use is made of technologies like Flash and Silverlight. If a site’s UI is comprised entirely of pages which make heavy use of these technologies then there will be lots of <OBJECT> tags in the site that point the browser to the Flash or Silverlight content, but not much else! Robots don’t look at <OBJECT> tags; there’s no point, because they would not know what to do with the binary content anyway. So if you’re not careful you can create a very rich site that looks great in a browser … but has absolutely no content that a search engine robot can index!
  • If your pages do make a lot of use of technologies like Flash and Silverlight, consider using a <NOSCRIPT> tag to add content for search engine robots to index. The <NOSCRIPT> tag is used to hold content to display in browsers that don’t support JavaScript, but these days pretty much all browsers do. However, search engine robots DO NOT support JavaScript, so they WILL see the content in a <NOSCRIPT> section of a page!
  • Related to the previous item, avoid having content that is only available via the execution of JavaScript – the robots won’t execute any JavaScript code, so your valuable content may be hidden.
  • Try to get other web sites, particularly “popular” web sites, to have links to your content. Search engine robots consider inbound links to your site as a good indicator of the relevance and popularity of your content, and links from sites which themselves have high ratings are considered even more important.
  • Tell search engine robots what NOT to look at. If you have content that should not be indexed, for any reason, you can create a special file called robots.txt in the root folder of your site, and you can specify rules for what should be ignored by robots. In particular, make sure you exclude any binary content (images, videos, documents, PDF files, etc.) because these things are relatively large and may cause a robot to give up indexing your entire site! For more information about the robots.txt file refer to http://www.robotstxt.org.
  • Tell search engines what content they SHOULD look at by adding a sitemap.xml file to the root folder of your site. A sitemap.xml file contains information about the pages that you DO want search engine robots to process. For more information refer to http://www.sitemaps.org.
  • Ensure that you don’t host ANY malware on your site. Search engine robots are getting pretty good at identifying malware, and if they detect malware hosted on your site they are likely to not only give up processing the site, but also blacklist the site and never return.
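Several of the structural rules above (a short page title, a single <H1>, a meta description) are easy to check mechanically. As a sketch, here is a small page auditor using only the Python standard library; the `audit` helper and its thresholds are my own illustration, not part of any particular SEO tool:

```python
from html.parser import HTMLParser

class SEOAudit(HTMLParser):
    """Collects the page features discussed above: title, H1 count, meta tags."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self._in_title = False
        self.h1_count = 0
        self.meta = {}

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "h1":
            self.h1_count += 1
        elif tag == "meta" and "name" in attrs:
            self.meta[attrs["name"].lower()] = attrs.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

def audit(html):
    """Return a list of problems found against a few of the rules above."""
    p = SEOAudit()
    p.feed(html)
    problems = []
    if not p.title:
        problems.append("missing <title>")
    elif len(p.title) > 65:          # illustrative limit; robots don't publish theirs
        problems.append("title too long")
    if p.h1_count != 1:
        problems.append("expected exactly one <h1>")
    if "description" not in p.meta:
        problems.append("missing meta description")
    return problems
```

Running `audit` over each page of a site gives a quick, repeatable pass over the easy-to-miss basics; the Microsoft SEO Toolkit mentioned below performs this kind of analysis (and a great deal more) automatically.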

Getting to grips with all of these things can be a real challenge, especially on larger sites, but there are tools out there to help. In particular I recently saw a demo of a new free tool from Microsoft called the SEO Toolkit. This is a simple application that you can use to analyze a web site in a similar way to the way that search engine robots look at a site, and the tool then produces detailed reports and suggestions as to what can be done to improve the SEO ratings for the site. You can also use the tool to compare changes over time, so you can see whether changes you make to the site have improved or worsened your likely SEO rating. For more information refer to http://www.microsoft.com/web/spotlight/seo.aspx.
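For reference, the robots.txt file mentioned above is just a plain text file placed at the root of the site. A minimal example (the paths here are hypothetical) might look like this:

```
# robots.txt — placed at the site root
# Keep robots away from large binary content, as suggested above
User-agent: *
Disallow: /images/
Disallow: /downloads/

# Point robots at the sitemap of pages you DO want indexed
Sitemap: http://www.example.com/sitemap.xml
```

The companion sitemap.xml is an XML file listing the URLs you want indexed, using the <urlset>/<url>/<loc> element structure described at http://www.sitemaps.org.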

This article only scratches the surface of what is an extensive and complex subject, but hopefully armed with the basics you can at least be aware of some of the basic rules, and start to improve the ratings for your site.


Another TechEd Sticky Note

By synergexadmin, Posted on June 10, 2010 at 11:58 pm

Avatar

The other night, I discovered the way to beat the heat while here in New Orleans. It’s a fruity little concoction known as the Hurricane, and while it doesn’t actually affect the climate around you, it sure makes feeling hot and sticky a lot more enjoyable. I’m also pretty sure how it got its name: in the morning, you find yourself trying to reconstruct the previous 12 hours of your life by putting together the pieces and fragments of your memory.

TechEd 2010 draws to a close this evening, and though it’s been increasingly difficult to find sessions that seem pertinent to us Synergexians, it’s still been a worthwhile experience.
I’ve learned a lot just by watching presenters step through the build of a Silverlight UI using Microsoft Expression, or show off the latest features of Visual Studio 2010 and how it can be used to quickly create a web app, or walk through the use of new simplified Windows Communication Foundation 4 features. I’ve even filled in the holes in my schedule with sessions on interesting (to me) topics, such as IPv6, trends in cybercrime, and hacker techniques.

Which all brings me to the point of this little blog entry: It seems to me that the value of conferences lies not in the number of sessions that directly apply to you, but in the quantity and quality of the little tidbits you pick up each day. It’s in the discussions you have with other developers and like-minded individuals – whether they take place while sitting down over a cup of coffee, or simply during a quick ride in the elevator. It’s in the creative ideas that spring up when you see a clever implementation and wonder if you can apply the same techniques to an unrelated solution of your own. It’s in the tips, tricks and techniques that you pick up, which will not only save you hours, days, and even weeks of effort in the year ahead, but which can also be shared with the rest of your team to make them more productive as well.

Just a sales pitch for SPC2010? Perhaps…but that wasn't the intent. After all, this is my blog, and with it I get to share helpful experiences from my time “out in the field.” If writing about it all means I’ll get to see more of you when we set up shop in October at the Citizen Hotel, then so much the better. But in the end, my little revelation about the value of coming to TechEd – even with so much focus on technologies that I can’t use – is helping me to sit back and enjoy this final day of the conference, secure in the knowledge that I’m going to be learning something interesting at every turn. And isn’t that what attending the conference is all about?

That, and the Hurricanes, of course…


Preparing for Windows Phone 7

By Steve Ives, Posted at 7:59 pm

Steve Ives

By Steve Ives, Senior Consultant, Synergex Professional Services Group

Later this year, probably, Microsoft are releasing a new version of their phone operating system, and it’s going to be a BIG change for developers who have created applications for the earlier Windows Mobile operating systems. The new O/S is called “Windows Phone 7”, and although under the covers it’s really still Windows CE, on the surface things will look VERY different.

Perhaps the largest single change will be the user interface of Windows Phone 7 devices, which will be entirely driven by Microsoft Silverlight. That’s potentially great news for existing Silverlight or WPF developers, but will of course mean a total re-write of the UI for developers with existing applications which were essentially based on a subset of Windows Forms.

Using Silverlight will mean that we can expect some dazzling UI from applications, and indeed the O/S and the standard applications provided with it already look pretty cool, but there will definitely be a learning curve for anyone who has not developed Silverlight applications before.

Part of the good news is that the basic tools that you need to develop Windows Phone 7 applications are free. You can download Visual Studio Express Phone Edition and have pretty much what you need to develop applications. At the time of writing though, these tools are in a “pre-beta” form, and as such you can probably expect some issues, and need to update the tools pretty regularly.

There is, in my humble opinion at least, also some bad news, not least of which is that Microsoft seem to have turned the Windows Phone platform into, essentially, another iPhone! While developers can use free development tools (or full versions of Visual Studio) to create their applications (just like with the iPhone) they will have to sign up for a $99 annual “Windows Phone Developer” subscription in order to have the ability to deploy their application to their physical phone for testing (just like with the iPhone).

It will no longer be possible to deploy applications via “CAB file” installations; in fact, for anything other than developer testing, the ONLY way to get an application onto a Windows 7 Phone will be via the Microsoft “Windows Phone Marketplace” (just like with the iPhone). When a developer publishes an application to the marketplace they can choose whether the application is free, or is to be charged for. With iPhone development developers can submit an unlimited number of free applications, and many do. With Windows Phone 7, developers can only submit five free applications, and after that there will be a charge to submit further free applications. If an application is submitted for sale, Microsoft will take a 30% cut of any proceeds (just like with the iPhone).

Applications submitted for inclusion in the marketplace will be subject to “testing and approval” by Microsoft (just like iPhone apps), and apps may be rejected if they don’t meet the guidelines set by Microsoft (just like with iPhone apps). This inevitably means that some types of applications won’t be allowed. For example, with the iPhone it is not possible (in the US at least) to use “tethering” (plugging your iPhone into your laptop in order to access the Internet via the cell phone network), and I would imagine we’re now going to see similar restrictions on Windows Phone 7 applications.

iPhone applications execute in a very strictly defined sandbox, and while this does afford a lot of protection for the platform (because, for example, one application can in no way interact with the data of another application), it can also seriously limit what applications can do. For example, on the iPhone it is not possible to save an email attachment (say a PDF file) and subsequently open that PDF file in another application, Acrobat Reader for example. While I understand the protections offered by the sandbox approach, as a user of the device I feel that it goes too far in restricting what I can do with the device. The Windows Phone 7 platform is essentially exactly the same.

Other restrictions in the Windows Phone 7 platform that developers will have to come to terms with are:

  • No access to TCP/IP sockets
  • No access to Bluetooth communication
  • No access to USB connections to a host computer
  • No Windows Forms UIs
  • No SQL Express access
  • No ability to execute native code via P/Invoke (except for device drivers, which must be approved by Microsoft)
  • No customization of O/S features (e.g. no alternate phone dialers)

One thing that strikes me as kind of strange is that, apparently, the web browser on Windows Phone 7 will not support Flash, and apparently will not support Silverlight either! The Flash thing is kind of expected, since both Apple and Microsoft seem to do everything they can to keep Flash off THEIR devices, but not supporting Silverlight (on an O/S where the entire UI is Silverlight) was a surprise … at first. Then I realized that if the browser supported Silverlight, there would be a way for developers to circumvent all of the application approval and marketplace restrictions I talked about earlier!

Another surprise was that, like all versions of the iPhone until iOS 4.0, Windows Phone 7 devices will only execute a single user application at a time. This is one of the main things that iPhone users have complained about through the versions, and Apple has only just learned the lesson, but it seems that Microsoft have decided not to. For developers this means it is imperative that applications save their data and state frequently, because the application could be terminated (with notification and the ability to clean up, of course) at any time.
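The pattern described above — persisting transient state whenever the platform signals that the application is about to be suspended, and restoring it on relaunch — can be sketched generically. To be clear, this is an illustrative sketch, not the actual Windows Phone 7 API; the class and callback names (`DraftEmailScreen`, `on_deactivated`, `on_activated`) are hypothetical:

```python
import json

class DraftEmailScreen:
    """Toy 'application page' that keeps transient user input in memory."""

    def __init__(self, store):
        self.store = store   # any dict-like persistent store
        self.subject = ""
        self.body = ""

    def on_deactivated(self):
        # Called when the O/S is about to suspend or terminate the app.
        # Persist everything needed to restore the user's context.
        self.store["draft"] = json.dumps(
            {"subject": self.subject, "body": self.body})

    def on_activated(self):
        # Called when the app is resumed or relaunched; restore state.
        saved = self.store.get("draft")
        if saved:
            state = json.loads(saved)
            self.subject = state["subject"]
            self.body = state["body"]

# Simulate the O/S terminating and relaunching the app:
store = {}
page = DraftEmailScreen(store)
page.subject, page.body = "Hello", "Half-written message"
page.on_deactivated()            # O/S suspends the app

fresh = DraftEmailScreen(store)  # app relaunched from scratch
fresh.on_activated()
print(fresh.subject)             # the user's draft survives
```

The point is simply that on a single-tasking platform the "deactivated" callback may be your last chance to run, so anything the user would hate to lose should be written out there rather than only at explicit save points.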

One thing is for sure … Microsoft seem to be betting the company on “The Cloud”, and Windows Phone 7 falls straight into this larger scale objective. The vision is that this new device will be a gateway to The Cloud in the palm of your hand. It is expected that many applications may execute directly from The Cloud (rather than being installed locally on the device) and that the device will have the ability to store (and synchronize) data in The Cloud. Apparently these features will be included for free, with a limited (not announced) amount of on-line storage, and presumably fee-based options for increasing the amount of storage available. Of course, using things in The Cloud is all well and good, until you find yourself in a "roaming" situation, paying $20/MB or more!

On the bright side, Windows Phone 7 devices will be available from a number of different manufacturers, so there will be choice and competition in the marketplace. Windows Phone 7 devices will (in the US at least) be available from a number of cell phone carriers, unlike Apple’s exclusive deal with AT&T.

While there is no doubt that Windows Phone 7 promises to be a seriously cool new device, and I have no doubt will sell in larger numbers than any of the predecessor Windows Mobile devices ever did, it remains to be seen whether it will have what it takes to be a serious competitor to the mighty iPhone. I can’t help wishing that Microsoft had done at least some things a little bit differently.
