
Saturday, February 27, 2010

Are you 'Visual'?

If you're like a lot of people, you learn more by observing how a process or task is completed than by having it explained or by reading a book. While manuals and instructions are important, hands-on experience reinforces the training being received and really 'drives it home'. I am one of the visual types myself. I like to work with a new process or product, dig into it, see what it's made of, push it to its limits and see what happens. This past week I had a wonderful time in Newbury, England with the Micro Focus Development Teams kicking the tires on Visual COBOL.

"Visual COBOL"?!?!? Yep...Visual COBOL. Now this isn't some marketing campaign kicked off by the Micro Focus execs to try to sell more re-branded, same-old, run-of-the-mill compilers (trust me, this ain't that at all). This is a whole new product that was developed to take advantage of the Microsoft .NET environment and platform. Before I explain Visual COBOL I need to fill in some details. On or about April 12th, Microsoft is going to release Visual Studio 2010, along with several other products. This is going to be a major release for Microsoft. There will also be a new version of the .NET Framework. As part of the global agreement between Microsoft and Micro Focus, Micro Focus is required to release a product that takes advantage of the new architecture and features of Visual Studio 2010...and to do it within 30 days of the VS2010 launch date. Enter Visual COBOL.

Visual COBOL is 'not the same old COBOL' from Micro Focus. Not even close. Oh, there are similarities: a 30-plus-year history of COBOL development experience, a lineage that has spanned multiple platforms, multiple language revisions and dialects, and more developers banging away on it than most other language development environments. But that's where the similarity ends. Visual COBOL is a whole new development product. The folks in Newbury took the knowledge and experience they have, married it with the excellent technology of .NET, and created a whole new compiler and runtime environment. As some people have said in the past about COBOL's growth..."this ain't your Daddy's COBOL", and they are so right. Now don't assume that just because it's a "whole new development product" your old COBOL code won't work on it...wrong answer, Skippy! Although the development product is brand new, Micro Focus has taken great care to ensure backward compatibility. To that end, back to this week!

The week in Newbury was designed to bring a group of people from outside Newbury together with the Newbury and Sofia Development Teams to really kick the tires on Visual COBOL. The goal was to see where the product was in relation to the project timeline, what issues were noted, and what could be resolved before the go-live date. Everyone brought along samples of code, and the first thing everyone did was try out their code on the compiler. I can attest that 100% of my previous code compiled and executed. The Team from Japan was very impressed that not only did their code execute, but the interface was accurate, with very few issues noted. The Team from France had similar results. The major portion of the week, though, was devoted to a process called 'STX', or Super Test eXtended: a three-day programming marathon session for everyone involved. All development on Visual COBOL was halted. The developers were split into teams of two, and each team was to come up with a project using Visual COBOL. They were supposed to use Visual COBOL as you or I would, as end-users. All coding was to be done in COBOL, and only COBOL. This was the first time I had participated in an exercise like this. To put it mildly, I felt like a fish out of water. While I like to think I'm a good coder, I felt like a kid fresh out of elementary school around these people. The way their minds work and the things they came up with were amazing, and their ability to go from concept to prototype in a very short period of time was awesome. The end result...WOW!

Friday was our presentation day, the day we regrouped, showed off what we had come up with, and described our experiences using Visual COBOL. Now I need to mention that some of the developers involved were new to Micro Focus and were used to coding in Java and C++, in Eclipse, on Unix/Linux. The bottom line, though, was that they had to use Visual COBOL running under Visual Studio 2010. The end results were pretty amazing. (As an editorial note, Micro Focus has some awesome developers. The stuff these guys not only came up with but demonstrated was amazing. Back to the article.)
One of the Development Directors (hey, he was a coder at one time before he became a 'Director') came up with a copybook parsing routine. Given a copybook, the routine would identify its different segments, picking out keys, redefines, etc., and apply different colorizations to the fields depending on usage. Here's a screen shot.
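I can't share his routine, but to give a rough feel for what copybook parsing involves, here's a minimal sketch in Python. Everything in it is my own invention for illustration (the simplified grammar, the names, the "tag" idea); real copybooks also have OCCURS clauses, 88-levels, continuation lines and more, which his routine handled and this sketch does not.

```python
import re

# One very simplified copybook line: level number, data name,
# optional REDEFINES clause, optional PIC clause, closing period.
LINE_RE = re.compile(
    r"^\s*(?P<level>\d{2})\s+(?P<name>[\w-]+)"
    r"(?:\s+REDEFINES\s+(?P<redefines>[\w-]+))?"
    r"(?:\s+PIC\s+(?P<pic>[X9SVP()\d]+))?"
    r"\s*\.\s*$"
)

def parse_copybook(text):
    """Return one dict per recognized data item, tagged the way an
    editor might pick a colorization for it."""
    items = []
    for line in text.splitlines():
        m = LINE_RE.match(line.upper())
        if not m:
            continue  # skip anything this toy grammar doesn't cover
        item = m.groupdict()
        if item["redefines"]:
            item["tag"] = "redefines"   # color REDEFINES items differently
        elif item["pic"] is None:
            item["tag"] = "group"       # group item: no PIC clause
        else:
            item["tag"] = "elementary"
        items.append(item)
    return items

sample = """\
       01  CUSTOMER-REC.
           05  CUST-KEY        PIC 9(6).
           05  CUST-NAME       PIC X(30).
           05  CUST-NUM REDEFINES CUST-NAME PIC 9(30).
"""
for item in parse_copybook(sample):
    print(item["level"], item["name"], item["tag"])
```

A real routine would of course feed those tags into the Visual Studio editor's colorization services rather than printing them.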
Another example was the snake game. Using the Micro Focus ADIS controls (going back to the bit about backward compatibility here), a snake game was created: you move around the screen eating letters or numbers, but you can't run over your own tail, which grows in length as you eat the letters off the screen. The Team in Sofia, Bulgaria created an XML parser/generator that could read in a file and generate XML usable in a cloud-computing environment. Another person created a WPF tic-tac-toe game. One team of two even went so far as to search the internet for an interesting C# sample to see if they could take what was done in C# and convert it to Visual COBOL. (I found this example as well but am nowhere near as savvy as these guys, so I didn't even attempt it.) Have you ever played Visual COBOL Tetris? This version looked awesome. Great colors, quick response time, smooth graphics. All built using Visual COBOL and taking advantage of the .NET Framework 4.0. I'm hoping this example makes it into the release. It was really cool and, to be honest, looked better than the C# version (sorry, Microsoft). There were other examples, but these were the ones presented to the excited crowd.
Did you notice one thing about the examples described? Most of them were games! That's right: COBOL is no longer relegated to crunching business information and generating reports. While it is still very well suited to that task, it has come so much farther than that, thanks to companies like Microsoft and Micro Focus. The agreement between these two companies is translating into a lot of opportunity for companies with existing COBOL resources to become more profitable with what they have. By taking their existing COBOL source code and development teams and spending some time and money on retraining, customers will be able to realize some huge technological advances for what turns out to be a small investment. Visual COBOL and Visual Studio 2010 will enable customers to leap forward technically and afford them the ability to please their customers.
I'm really excited to have been a part of this effort and am very much looking forward to the release of Visual Studio 2010 and Visual COBOL. As soon as I get the OK I'll start describing the changes that are coming, not only in presentation but in the compiler. Trust me, it will make our COBOL code even easier to read and follow for non-COBOL programmers who can then learn how to code COBOL to REALLY become productive!
Happy Coding!

Friday, February 12, 2010

A Customer Visit

I'd like to share with you a visit I had last week with a decent-sized insurance carrier. This company is looking to migrate their current mainframe environment to Windows Servers. Management at the company looked at the costs, the benefits, the pitfalls, and the time it will take, and decided it makes the most sense for their company to do this to remain competitive as well as profitable. Management looked at all the numbers and laid down an edict to be off the mainframes by a specified date. Nothing new so far, right? I mean, this is happening more and more. One would think, imagine, even expect the people "in the trenches" to be ready to mutiny! No one really asked their opinion or advice. No one asked if it was technically possible. No one asked them anything. But you know what? It turns out they didn't have to, and to quote Paul Harvey..."and now...the rest of the story".

Insurance companies have taken quite a financial hit, not only lately but over the past decade or more. Costs are skyrocketing, resources are shrinking, and looking for places to cut costs has become the mantra of today's insurance companies. IT seems to take a hit every time. In a lot of companies, costs are slashed, staff eliminated, and those left behind have to deal with the fallout. This company was similar; they'd had recent staff reductions. But the remaining staff saw an opportunity. They realized the ways of doing business were changing and that they needed to look at new and innovative ways to save their company. Yeah, even though they lost friends during the last round of layoffs, they still considered the company a great place to work and wanted to help as much as possible. So that's the background.

In most companies I've been to, the first, or nearly the first, sentence out of anyone's mouth is "It'll never run off the mainframe" or "You can't run that on a PC". You know the first words I heard from these people? "Cool, what do you need to make it happen?" Attitude. They had a positive attitude and were looking for ways not only to strengthen what they currently had, but to expand what they were doing to make it all work better together. Better integration. Better performance. Better code analysis. Better production runs. They saw this as an opportunity to better not only themselves, but their company.

I can't relate how refreshing it was to see this attitude. A lot of people would've been pessimistic. They were optimistic. During the engagement we exchanged ideas, did a lot of 'what-ifing', and worked out solutions. I'm not saying all the other accounts I've visited have been depressing or unhelpful, but there was an electricity at this account. Apprehensive? Yes, but optimistic as well. It's all about the attitude we convey to others.

Now you may ask, "What has this got to do with anything?" Really not a lot, and really a whole lot. On the surface it's a feel-good article about someone and someplace most of us will never go. Deep down, though, it's about the attitude we bring to our daily jobs and the ability to see outside of our defined box and the desire to expand it and ourselves. Look outside your box; look for opportunities to grow and expand yourself and your capabilities. You may be surprised at what you can do!

Sunday, February 7, 2010

To Emulate or Not to Emulate - That is the Question

Some great comments on the last posting.  Thanks everyone for your input!

(emulation at its best)

As you may have guessed from the title, I'm going to see if I can tackle the emulation comment. I've never really thought of the Micro Focus Mainframe Subsystem (MSS) as an emulation environment. But then I went out and looked up the definition of emulation. Hmmm...seems like maybe I'm wrong to some degree if you follow that definition. One part of the definition fits nicely, "When one system performs in exactly the same way as another...", but the rest pointed out an opinion I've always associated with emulation: "...though perhaps not at the same speed."

My thinking has been that an emulation environment involves a "wrapper" around the application, filtering the API calls and executing lower-level calls to OS-specific functions to perform comparable tasks. This means extra steps, which equates to slower performance.

After additional pondering (no, Rick, I didn't fall asleep!), the key component that slows things down is the wrapper layer. However, from my use of the Micro Focus tools, I don't see that wrapper, or the penalty such a wrapper would impose. What I see are direct calls to platform-specific APIs which provide the same functionality as on the mainframe. The difference is simply that they are the OS-specific versions of those routines.

Maybe I'm nuts, but to me this seems a bit different. What you end up with is something that doesn't pay the performance price you would see with that "wrapper layer" or shell. A still calls B, not A calls B which calls C.

The MSS is emulating the functionality found in the z/OS environment, but I don't believe it is acting as a shell environment within which the application runs. Statements such as EXEC CICS and READ / WRITE are calling APIs which are compiled to run natively on the platform, whether UNIX, Linux or Windows. For instance, a READ statement is passed to the Micro Focus file handler, which executes the appropriate lower-level function to access the indexed file on that platform. And if you have the need, you can even replace the file handler with your own. The MSS is Micro Focus' own engine, providing the same functionality found on z/OS for things like JES or CICS or IMS, but on different operating systems.
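To make the "A calls B" versus "A calls B which calls C" distinction concrete, here's a toy sketch in Python. Every name in it is invented; this is not Micro Focus code, just the shape of the argument: a wrapper layer pays for a translation step on every call, while a direct binding compiles the same verb straight onto the native routine.

```python
# C: a native, platform-specific routine (think: the OS-level indexed-file read).
def native_indexed_read(path, key):
    return f"record {key} from {path}"

# Wrapper-style emulation: A calls B (the wrapper), which first translates
# the mainframe-style request, and only then calls C (the native routine).
def wrapper_read(mainframe_request):
    verb, path, key = mainframe_request.split("|")  # the extra translation step
    if verb != "READ":
        raise ValueError(f"unsupported verb: {verb}")
    return native_indexed_read(path, key)

# Direct binding: the compiled READ statement is already a call to the
# platform routine. A still calls B; there is no middle layer to pay for.
direct_read = native_indexed_read

# Same answer either way -- the wrapper just does more work to get there.
assert wrapper_read("READ|CUST.DAT|000042") == direct_read("CUST.DAT", "000042")
```

The translation step here is trivially cheap, of course; in a real emulation layer it's the accumulated cost of that middle tier on every call that shows up in the run times.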

To give you some perspective, I recently had the opportunity (during the snow storm in New York City in December - thanks, Rob! *grin*) to do a performance test for a prospective customer. The prospect gave us a z/OS batch routine (JCL, COBOL, VSAM) which processed just under eleven GB of EBCDIC data (one input file). On the z10 z/OS mainframe this job ran in two minutes and thirty seconds of CPU time and created fifty-six separate GDG output files.

After setting up the tools in a Windows Server 2008 R2 virtual machine managed by Hyper-V on a dual CPU quad-core HP Blade with a high speed SAN, we were able to recompile the program and run the job with no changes to either the program or the data file (we kept it EBCDIC).  In five runs, we averaged 1 minute and 20 seconds.  Is this typical?  Nah.  But is it unusual?  Nah.
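For what it's worth, the back-of-the-envelope arithmetic looks like this (bearing in mind the mainframe figure is CPU time while our 1:20 was elapsed time over five runs, so it's a rough comparison at best):

```python
# Back-of-the-envelope numbers for the batch test described above.
data_mb = 11 * 1024             # just under eleven GB of EBCDIC input
mainframe_secs = 2 * 60 + 30    # 2m30s of CPU time on the z10
windows_secs = 1 * 60 + 20      # 1m20s average elapsed on the blade

speedup = mainframe_secs / windows_secs
print(f"speedup: {speedup:.2f}x")                        # roughly 1.9x
print(f"throughput: {data_mb / windows_secs:.0f} MB/s")  # roughly 140 MB/s
```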

If this application execution environment were wrapped in an emulation layer, I just can't see how it could be faster. It would have more to do to perform the same work and would therefore be slower.

So, yes, it emulates the functionality of the environment, but it doesn't do it by running in an emulation environment.  Almost a Yogi statement there huh?

There are a few folks who follow this blog who might be able to add their two cents to this.  Maybe they will post some thoughts.