
Thursday, December 2, 2010

How to be Controlling

During the project I mentioned in my last post, the C# expert created several custom controls in COBOL, which we used on the various WinForms in the application.  I had never done anything with user controls myself, and since I haven't seen the subject written up anywhere else, I thought it would make a good topic for discussion.  How do you create a custom control using COBOL?

Yep.  I'm glad you asked *smile*

Instead of starting from scratch, I figured the right approach would be to capitalize on work done by others.  Taking that approach, I found an article at http://knol.google.com/k/creating-custom-controls-with-c-net# which made my job easy.  The article spells out how to create a custom progress bar for your application.  Seems like something useful, huh?

 
Before I get started, let me make it clear that I’m not going to go into all the details of custom controls in this article. You can find more information, in much greater detail and with deeper understanding than anything I could write, in the referenced article for one.  I’m going to spend my time recreating the same control Dave (the original author) outlined in his post.  The difference is that I’ll be using Micro Focus Visual COBOL and Visual Studio 2010. So print out Dave’s directions and follow along with me while I highlight the differences.
Getting Started
First, I created the same project using the same name, but instead of C#, I chose to create a managed COBOL Windows Forms Application (didn't want to strain too much by thinking too hard too soon, of course).

 
Next, just like Dave, I added a new project to the solution and called it “ProgressBar”, making sure to select the managed COBOL Class Library template.
I then added the reference to the new project as directed and was ready to create the control.  As with the C# example, Visual COBOL created a shell class program for me to use as a base. And as in the example, I deleted it because I wouldn’t be using it.

From the Solution Explorer, I selected the ProgressBar project, right-clicked to Add a new item, selected “User Control” from the menu and named it “ProgressControl”.

Once the user control was created, I set the properties as Dave had them:

  • BackColor: Window
  • BorderStyle: FixedSingle
  • Size: 148, 14
  • DoubleBuffered: True
As for the Code
Even though I used COBOL, you’ll notice the code-behind page that was created for the ProgressControl is very similar to the C# version.

Next, I added the code that lets you choose a foreground color for the progress bar, and added the “System.Drawing” namespace to the program so that I didn’t have to fully qualify things like “System::Drawing::Color” in my COBOL code. I just wanted to type “Color”, same as Dave did in the C# routine. To do this, I typed the appropriate $set statement at the top of the program. While I was there, I also added a line for “System.Windows.Forms” because we’ll need it later.
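In Visual COBOL, that’s the ilusing directive. At the top of the program it looks something like this (one form of it, anyway; check the docs for your release):

$set ilusing"System.Drawing"
$set ilusing"System.Windows.Forms"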

You’ll also see I added the variable definition for barcolor to the working storage section.  In the original C# example, Dave declared a private Color field and wrapped it in a property.
The biggest difference is that I had to write a bit more code for the get and set methods than the C# had. Not much, but I had to spell out each method, along with its own working storage, procedure division, etc.
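Reconstructed from memory, mine came out something like this (a sketch, so the exact modifiers may differ in your release):

      *> in the working-storage section of the class:
       01 barcolor type Color.

      *> the base Control class already defines ForeColor,
      *> so the property is declared as an override
       method-id get property ForeColor override.
       local-storage section.
       01 ret type Color.
       procedure division returning ret.
           set ret to barcolor
           goback.
       end method.

       method-id set property ForeColor override.
       linkage section.
       01 clr type Color.
       procedure division using by value clr.
           set barcolor to clr
      *> repaint so the new color shows up right away
           invoke self::Invalidate()
           goback.
       end method.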

 
Once that portion was coded, I had to make a slight change to the code to create the Value property, because “Value” is a reserved word in COBOL. So, instead of using “Value” as a variable name, I used pvalue. I placed the working storage definition directly after the definition for barcolor (see above).

I used COMP-1 because it is the COBOL equivalent of a short float.  And as with the ForeColor get and set methods above, I wrote the two methods to do what the C# code does:
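Roughly like this (again a from-memory sketch; I’m calling the property Value to match the C#, and if the compiler objects to the reserved word as a method name, it will need adjusting):

      *> placed right after barcolor in working storage:
       01 pvalue comp-1.

       method-id get property Value.
       local-storage section.
       01 ret comp-1.
       procedure division returning ret.
           set ret to pvalue
           goback.
       end method.

       method-id set property Value.
       linkage section.
       01 new-value comp-1.
       procedure division using by value new-value.
           set pvalue to new-value
      *> repaint so the bar redraws at the new value
           invoke self::Invalidate()
           goback.
       end method.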


Again, you’ll notice that I used COMP-1 when defining the field that is to be used as a short float. The last bit of code looks like this:
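The paint routine, give or take (my sketch; Dave’s original does the width math inline with an int cast, while I keep the width in a whole-number field):

       method-id OnPaint protected override.
       local-storage section.
       01 bar-brush  type SolidBrush.
       01 bar-width  binary-long. *> whole number, so no decimals to chop off
       01 bar-height binary-long.
       linkage section.
       01 e type PaintEventArgs.
       procedure division using by value e.
      *> scale the filled portion to the current value (0 to 100)
           compute bar-width = (self::Width - 2) * pvalue / 100
           compute bar-height = self::Height - 2
           set bar-brush to new SolidBrush(barcolor)
           invoke e::Graphics::FillRectangle(bar-brush, 1, 1, bar-width, bar-height)
      *> give the base class its chance to paint as well
           invoke super::OnPaint(e)
           goback.
       end method.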

Two items may stand out here. The first is the invoke statement “invoke super::OnPaint(e)”. Simply put, C# uses the keyword “base”, while COBOL uses “super” to refer to the base-class version of the method. Dave’s article points out that this particular statement is optional in his simple example but recommended for more complex paint routines. If I had used “self” instead of “super”, the code would refer to the current version of the method (the code shown above). That would cause a recursive loop where, every time the method was called, the first thing it would do is call itself. That situation ultimately consumes all the memory on the machine (guess how I know *smile*), causing Visual Studio to abend.

 
The second item which may stand out in this piece of code is that in the C# example, Dave used “int” to remove the decimal places from the resulting calculation. Instead of dealing with that, I just created a width field with no decimal places. For everything else, I followed the directions provided by Dave. And much to my surprise, it worked the first second third time I hit the run button.

Of course, I left out all the mistakes I made trying to decipher the C#, but I never claimed to be a C# programmer, now did I? That’s what I call my buddy Mike for.  *smile*

Hopefully you find this of interest and value.  If you find any errors in my code or have suggestions on how to make it better, by all means, please share!  All contributions greatly appreciated!

Wednesday, October 20, 2010

C# Expert Makes COBOL.Net Scream

This week I'm spending time working with a C# expert on a COBOL modernization project.  What's interesting is that this individual is using his knowledge of C# to re-architect an existing COBOL application, and the results will still be COBOL. 

While I'm admittedly a novice on the ins and outs of .Net, he's extremely strong in both the framework and object-oriented design.  Combine his know-how with a basic education in COBOL.Net and, in less than two weeks, we converted the application front end to WinForms and tied it to the existing COBOL business logic.

Why is this interesting?

This proves that your company can give your COBOL developers and your .Net developers a tool like Micro Focus Visual COBOL, plus a week of basic education on COBOL.Net, and ...

TA-DA!

You can get the best of both worlds.  You get a group who knows both Microsoft .Net and COBOL who now have the basic ingredients to begin modifying your COBOL applications to fit your current corporate IT direction.  Comingle the groups and the technologies and you'll get several things:
  • A brand new application, based on tried and true application source, leveraging an industry-standard framework.
  • An energized team of people educated in both your mission-critical applications and the framework you've adopted as your corporate direction.
And the team will really enjoy doing it.

Huh?

Yep.  I've seen it this week.  People who want to be involved are stopping by every single day asking for details and wanting to know how they can help.  This includes folks from both sides of the development shop!  Those already on the team are having fun doing something they never thought could be done.

It's true.  Put developers to work breathing life into these systems and they will enjoy the work.  Developers like a challenge and cool tools.  Both sides of the team will become immersed in learning their parts of this "new world", and the religious battle about languages will become secondary to the mission.  Each group will use the tools and language that meet their particular needs, and you will get an application based on the tried and true systems which have been running your business.

Don't believe me?  Give it a test drive.  Put your own team together, mixing folks from both sides of the fence and give them the basic goal of bringing an existing COBOL application forward to the .Net world.  Add in some education / guidance and see what happens.  Let me know the results!

Sunday, October 10, 2010

ASP.Net, ADO.Net and COBOL - The Right Tools for the Job

Hey folks,

In working on learning how to write a web application using COBOL, I thought it would be interesting to understand how to retrieve a BLOB (binary large object) using ADO.Net and display it back to the user.

So, armed with a downloaded copy of the AdventureWorks database from Microsoft's website, I set about creating an ASP.Net page and its corresponding "code behind" page (in COBOL, of course) to do just that.


First off, you'll see from the ASP.Net page in the image above that I'm populating a drop-down list box with data from one of the tables in the database.  Once the user selects a row from the list, the code-behind page launches the "Fetch_ProductImage" method defined within my COBOL program (to see the complete images you may have to click on them to make them full screen).


The image above shows the various objects I had to define using some of the "new" data types I mentioned in an earlier post to this site.  Additionally, you'll see that I define objects based on the ADO.Net classes.  How I use them is shown below in the method "Fetch_ProductImage", which grabs the image stored within the BLOB field for the selected item:
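The general shape of it is below. Treat this as a sketch rather than my exact code: the connection string is an assumption, and the Production.ProductPhoto table with its LargePhoto column is where AdventureWorks keeps these images:

      *> assumes $set ilusing"System.Data.SqlClient" at the top of the program
       method-id Fetch_ProductImage.
       local-storage section.
       01 db-conn  type SqlConnection.
       01 db-cmd   type SqlCommand.
       01 db-rdr   type SqlDataReader.
       linkage section.
       01 photo-id binary-long.
       01 img-blob object.
       procedure division using by value photo-id returning img-blob.
      *> open a connection to the AdventureWorks database
           set db-conn to new SqlConnection(
               "Server=(local);Database=AdventureWorks;Integrated Security=SSPI")
           invoke db-conn::Open()
      *> define the statement to execute - the "declare cursor" step
           set db-cmd to new SqlCommand(
               "select LargePhoto from Production.ProductPhoto" &
               " where ProductPhotoID = @id", db-conn)
           invoke db-cmd::Parameters::AddWithValue("@id", photo-id)
      *> open the data reader and fetch the row - the "open/fetch" steps
           set db-rdr to db-cmd::ExecuteReader()
           if db-rdr::Read()
      *> GetValue hands the BLOB back as a System.Byte array
               set img-blob to db-rdr::GetValue(0)
           end-if
      *> close everything - the "close cursor" step
           invoke db-rdr::Close()
           invoke db-conn::Close()
           goback.
       end method.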


Once selected, the image is displayed to the user via the web browser.  Pretty slick, huh?

It's much like using cursors with traditional SQL, with many of the same steps.  In traditional SQL, you define the cursor, open the cursor, read the row(s), then close the cursor.  It's a similar thing with the data reader: you define the statement you wish to execute against the table, open the data reader, load it with the rows you want, read the row(s) from it, then close the connection.  Very similar processes.

One interesting difference I've found is that with a data reader, the rows are "read only".  With traditional SQL you can choose to update the rows contained within the cursor, but within a data reader the data can't be modified.  And as I believe is true with a cursor, you can only read forward with the data reader.  Once you have read a row, the previous row is no longer available to you.

There are several other interesting things you can do with ADO.Net.  Just to give you an idea... everything above was based on examples in the first 50 pages of a 585-page book I bought on ADO.Net.  Yes, it was written for VB.Net developers, but I was able to translate it from gibberish into COBOL easily enough *grin*.

Overall it wasn't too difficult. Yes, I know I didn't go into detail on all the steps involved with the web form portion of this.  That's because I figure that was the easier part, and you too can pick it up from a good book on building ASP.Net web pages.  As I mentioned in a previous post, I've been working from Imar's book "Beginning ASP.Net 4 in C# and VB".  As for the rest, it's all shown above in the COBOL example.  There's surprisingly not that much to it, huh?

I hope this was of value for you.   Drop me a note if you have any questions or comments.  I'd love to hear from you!

Sunday, August 15, 2010

Mind The Gap

Once upon a time, I was told that Cobol developers couldn't cross the gap to learn that "object oriented" stuff.  Why can't we? Could it be that we aren't smart enough? 

I think not.

I believe it has to do with the combination of syntax and methodology.  Let's face it, Object Oriented Cobol syntax from the 2002 standard is confusing.  Too many quote characters cluttering things up, in my opinion.  Additionally, to use it, there was this object-oriented approach to program design we were supposed to learn.  And because we had to learn so much new material while coming from a procedural frame of reference, it stumped us.  I think we were trying to learn too many new things at once.

*cue creepy music*

I have a secret...

For the last couple of weeks,  I've been working secretly in my lab deep beneath the dungeon cooking something up.

I have been working to come to grips with...


Dare I say it...Cobol.Net and ASP.Net. *gasp*

(Ok, I was in my home office in the basement.  Anyway, back to the story)

I bought a book by Imar Spaanjaars titled "Beginning ASP.NET 4 in C# and VB", and I had this thought... I wondered if I could translate what Imar was trying to teach the VB and C# crowd into Cobol.Net syntax.  Guess what?  You can.  *smile*

First off, I've discovered that the new syntax the Micro Focus development guys have put together makes it easier to translate from VB.Net to Cobol.Net.

And you want to know what else I found?  The Object oriented approach started to make more sense.  Not just basic sense but the kind of common sense you hope your teenage son finally gets before he graduates college (I keep my fingers crossed). 

I found that for the most part I only had to grasp a handful of concepts and I could write slick web-based ASP.Net applications using Cobol.Net. 

Basically it comes down to the new data types I've mentioned in earlier posts and two "new" statements, Set and Invoke.  For instance, here is how you use the Set statement.
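The code looks roughly like this (reconstructed to match the translation below; ws-filename would be a string item in working storage, and the control names come from my web page):

           if FileUpload1::HasFile
               set ws-filename to FileUpload1::FileName
           else
               set UploadSuccessMessage::Text
                   to "No File Selected. Unable to Upload"
               set UploadSuccessMessage::Visible to true
           end-if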


Translation:
If the HasFile field of the FileUpload1 control on the current web page is true (think of it as an 88-level switch), set ws-filename to the value stored in the FileName field of the FileUpload1 control.  If it isn't, set the text of the UploadSuccessMessage object to "No File Selected. Unable to Upload" and make it visible to the user.

And guess what?  Invoke isn't much different.
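Again, roughly (matching the translation that follows):

           invoke Response::Redirect("Default.aspx?Redirect")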


Translation:
Call the Redirect routine of the Response "program" that is tied to the current web application and pass it the URL "Default.aspx?Redirect".

Based on this rather amazing revelation (amazing to me anyways), I've come to the conclusion that yes Cobol programmers can make the leap.

It isn't really that much of a leap actually, but more like a series of small steps. 

Additionally, I believe that C# and VB folks can understand the basics of this new version of the Cobol.Net syntax. 

In the near future, I hope to post the complete source for the web application I've been building and you can see for yourself what I'm talking about.  This stuff ain't rocket science.

Just in case you didn't notice, you too can give this a try.  Download a copy of Visual Cobol for your home machine for 30 days (www.microfocus.com/visualcobol) and give it a shot.  Can't figure out how to do something?  Maybe we can figure it out together.  Consider it a learning experience *grin*

Friday, July 23, 2010

COBOL: Still Learning and Growing


Continuity



I was recently asked about the direction of COBOL and whether or not I believed the language would continue for very much longer. To say I was taken aback by the question would be a serious understatement! I pointed the person to an article I wrote last year about COBOL not only being everywhere but also what it can do. The article was titled “COBOL: It’s everywhere” and can be found at http://www.c-sharpcorner.com/UploadFile/RSM50/EVRYWHR09032009152715PM/EVRYWHR.aspx . Now maybe I’m being rather naïve, but really, why wouldn’t COBOL not only continue but grow and adapt? All one has to do is take a serious historical look at the language to see not only continuous enhancement but continual expansion. Let’s take a quick historical review of COBOL, shall we?

Yes, COBOL is old. It’s over 50 years old already. In most cases it’s been around longer than most of the developers working in a given shop. Just because something is old, though, doesn’t mean it isn’t still useful and serving a purpose. We’ve heard the examples before: “Have a credit card?”, “Use a cell phone?”, “Have a house mortgage?”, “Pay income taxes?”… these and many more are run by COBOL systems at some point. Micro Focus once asked a reporter to live a single day without interacting in some manner, shape or form with COBOL, and it couldn’t be done. So it may be old, but it’s also very widespread, serving us day in and day out without any recognition. Now let’s look at how COBOL has grown through the years.

Evolution

Even though COBOL is old, it has not stopped learning. COBOL has gone through a number of revisions. To set the baseline let’s take a brief look at each of the versions of COBOL and identify the key contributions made to the language starting with ANS COBOL 68 (yes that’s 1968).

  • ANSI COBOL 1968 In December 1960 a COBOL program was successfully compiled and executed on two different platforms without requiring any code changes. This demonstrated the concept of compatibility to the early pioneers and the need to ensure the language remained consistent across multiple vendors. Over the course of the next several years, however, compatibility suffered. The American National Standards Institute (ANSI) set about the task of re-establishing compatibility, and COBOL 68 was the first ‘official’ release of the language.


  • ANSI COBOL 1974 This version was considered by many to re-emphasize the need for compatibility. It contained a number of features that were not in the original version.


  • ANSI COBOL 1985 Another revision of the original version with new features. One of the most important was the introduction of structured constructs, including scope terminators such as ‘END-READ’ and ‘END-IF’, to name two. This release was geared towards standardized coding techniques.


  • ANSI COBOL 2002 A significant enhancement of the language arrived with the 2002 release. Key concepts introduced included national language support (including the Unicode character set), user-defined functions, calling conventions to and from non-COBOL languages, framework support (.NET and Java), floating-point, and XML generation and parsing. One would characterize this release as geared towards interoperability.

Let’s look at the above information from a different perspective. Here are the versions of the language again, each with the key objective it achieved:

ANSI 1968: Language Standardization
ANSI 1974: Language expansion
ANSI 1985: Technique standardization
ANSI 2002: Interoperability

From this we see a progression of maturity in the language: beginning with defining a standard manner of communicating, then growing in capability, and finally enabling communication with others. A logical progression, enabling COBOL to achieve many more tasks than its creators ever imagined. Yet the perception persists that COBOL can’t do anything more than add numbers together. But what about the question posed earlier, “Will COBOL be around for very much longer?” Well, the NEXT standard has been in development for a while and is due to be released for public comment in August 2010.


Continued Expansion


The next standard’s goals are to continue to expand the capabilities of the language while making it able to interact with the other languages of the day. To that end, the following are being proposed for inclusion in the standard:
— Dynamic-capacity tables
— Function pointers
— Any-length elementary items
— Increased size limit on nonnumeric literals
— Enhanced locale support in functions
— Support for industry standard floating-point formats and arithmetic, including multiple rounding options
— Structured constants
— Enhanced date and time handling
— Parametric polymorphism, also known as method overloading



If you’d like to follow the progress of the standard, you can visit http://www.cobolstandard.info/j4/ . We as COBOL developers should be aware of the changes being proposed to the language we work with on a daily basis, so that we can adapt our techniques to take advantage of the new constructs in our toolbox.

Awakening



What has to happen, though, is for the development community worldwide to realize what a valuable asset it has in all this COBOL code floating around the world, and what a versatile language COBOL really is. Companies have invested billions of dollars in these systems. They’ve spent years tweaking, tuning and refining the logic to do just what they need. Then someone says “COBOL ain’t cool” and they look at getting rid of it. Why? We can interact with COBOL in ways unheard of before, with .NET being a prime example. Let’s wake up and realize the enormous economic investment that has been made in COBOL and, instead of trying to replace it, help it do even more. Look at new platforms for performance gains, look at new frameworks such as .NET for interaction, but let COBOL keep doing the job it was designed for… making money! Will COBOL be around for another 50 years? Time will tell, but with its demonstrated ability to grow, adapt and expand, I would bet on it.



Let’s hear what you have to say. Do you code COBOL? Does your company have applications written in COBOL? How are you using it? Are you looking to expand it? Have you or your company considered expanding what COBOL can do to achieve business success?

Thursday, July 1, 2010

The COBOL Version: Going The Way of Jello?

Once upon a time, every customer RFI had a question about the version of COBOL the compiler supported.  That was years ago.  I can't tell you the last time I got that question.  Not only that, I can't recall the last time anyone really cared.


Why is this?  Have COBOL versions gone the way of gelatin?  Have they blended so much, or become so generic, that everyone calls it by the same name regardless of version?  Name another brand of gelatin.  (Bet ya can't.)

Well, one reason I personally never get the RFI question is probably that the Micro Focus dialect has become the de facto standard if you are working off the mainframe.  And that just happens to be the toolset I work with.  Yes, it is nice to be with the market leader *grin*.  It does make life easier sometimes.  The only version-related question I've had recently was "Do you support Enterprise COBOL?".   Again, this was probably because that was the version on the mainframe they used.

With these two exceptions in mind, I still don't see why companies have lost interest in the latest version of the COBOL standard.  Ask a shop working with Java and you'll find out quickly whether they are on 1.3 or 1.6, etc.  Same thing with the .Net framework.  I guarantee you'll find out quickly whether a company targets the 2.0 or the 3.0 version of the framework.

But the COBOL version?  Maybe my imagination, but it doesn't seem to be that important anymore.

Could it be that everyone lumps every version into the same bucket?  It's all just COBOL, right?

As many may or may not know, the last published COBOL standard was released in 2002.  A number of items in the 2002 edition related to object orientation truly brought the language into the modern era.  From the notes on the COBOL Standards website, there are references to a release targeted for 2008 which was supposed to move it even further forward.  I wonder whatever happened to that version.  Anyone know?  Why did it die on the vine?  I know for a fact the group still meets and is working on advancing the language.  But why has it remained unpublished?  Isn't it about time a new version was announced?

Oh well, I digress.

What I'm most curious about is your company's conformance to a particular COBOL version or published standard.  What version do you use?  Are there any elements of the 2002 standard that your company uses or conforms to?  Or are they tied to a vendor specific version of COBOL like Micro Focus or Acucorp?


A more basic question...Do you know what is in the 2002 edition of the COBOL standard?  You can find it at the ISO website just in case you were curious...

And one other question before I finish up on the topic.

If a new version of the standard were released on the world, what would it mean to you?  To your company?  From what I can see, it's a safe bet that very little discussion would occur around the subject until someone needed to upgrade their compiler.

Kinda like the "if a tree fell in the woods, would it make a sound" question. *smile*

With my limited knowledge of what is being discussed by the group, this may ultimately prove a bit short-sighted if your company uses COBOL but isn't up on the subject.  There may be elements within the standard which would allow your company to save money or do more with the language.

All things considered, how would you propose the guys over at the COBOL Standards Group go about getting your attention?

What needs to happen to bring this to the forefront in your shop?

Just curious.

Wednesday, June 9, 2010

Declarative Sorting Of Dynamically Allocated In-Memory Tables With COBOL

Visual COBOL = Visual Studio COBOL
One of the things COBOL is really good at is data processing.

Its declarative data model makes it second only to SQL for this purpose. In both managed COBOL and native COBOL, Micro Focus COBOL can sort an in-memory table just by saying 'sort'. Not only that, a little trick with pointers and the linkage section means we can sort a table of any size and allocate the memory for that table on the fly. Here I am using Visual COBOL (Visual Studio COBOL) to work through these ideas, with an imaginary example of a retail data control system where we have product variants and stock keeping units (skus). Our little program shows how we can sort variant name/sku pairs by sku number.

What does declarative mean?

Here we simply declare the structure of our table and how it is indexed and can be sorted:

linkage section.
       01 variants occurs 1 to 1000 depending on variant-count
                   ascending key is sku
                   indexed   by jump-start-index.
           03 sku  binary-long.
           03 decr pic x(20).

Because we have put this in the linkage section, we have declared all this but have not allocated any storage for it yet - that comes later. Now we sort:

sort variants

Now that is the bit I love! Having declared the data and key structure of the table, COBOL knows everything it needs to know about how to sort it. We do not have to call some sort function, passing in something like a lambda or a callback. All that complexity is taken away. We want the table sorted - so we tell COBOL to sort it! If only my kids were so easy to instruct...

How About Pointers And Allocation?

As mentioned, the declaration of the table is in the linkage section. I have also created a working storage item 'mem-ptr' which is usage pointer. This approach means we can use CBL_ALLOC_MEM to assign a block of memory for storing our table. The size of the block to allocate is worked out by taking the size of each record (via the "length of" phrase) and multiplying it by variant-count. The location of that memory is held in mem-ptr, and we tell COBOL to use it for our table with the statement "set address of variants(1) to mem-ptr". However, COBOL does need to know how big the table is; that is handled by variant-count in the "depending on" clause of the table declaration.

Putting It Together In Visual COBOL

First open Visual Studio:


Then start a new project:


Then pick a native COBOL template:


Take the created source and replace it with the source below:


$set sourceformat(variable)
       program-id. "sorter".

       working-storage section.
       01 mem-ln        binary-long.
       01 mem-flags     pic x(4) comp-5 value 0.
       01 int-bl        binary-long.
       01 mem-ptr       usage pointer.
       01 status-code   pic x(2) comp-5.
       01 variant-count binary-long.
       
       linkage section.
       01 variants occurs 1 to 1000 depending on variant-count
                   ascending key is sku
                   indexed   by jump-start-index.
           03 sku  binary-long.
           03 decr pic x(20).
           
       procedure division.
       
           move 10 to variant-count
       
           set mem-ln to length of variants
           multiply mem-ln by variant-count giving mem-ln
           
           call "CBL_ALLOC_MEM" 
                using     mem-ptr
                by value  mem-ln mem-flags
                returning status-code
           if not status-code = 0
               display    "Failed to get memory"
               stop run
           end-if
           
           set address of variants(1) to mem-ptr
           
           move 1234       to  sku(1)
           move "beans"    to decr(1)
           move 12         to  sku(2)
           move "fish"     to decr(2)
           move 4532       to  sku(3)
           move "cat food" to decr(3)
           move 2342       to  sku(4)
           move "dog food" to decr(4)
           move 1231       to  sku(5)
           move "sauce"    to decr(5)
           move 1254       to  sku(6)
           move "bread"    to decr(6)
           move  999       to  sku(7)
           move "chicken"  to decr(7)
           move    1       to  sku(8)
           move "beer"     to decr(8)
           move    2       to  sku(9)
           move "wine"     to decr(9)
           move  9999      to  sku(10)
           move "tnt"      to decr(10)
           
           sort variants
           
           perform varying int-bl from 1 by 1 until int-bl greater variant-count
               display sku(int-bl) " =- " decr(int-bl)
           end-perform
           goback
           .
       end program "sorter".

Giving this:


Now click in the left margin to put a break point on the "goback":


Now step into the program using the debugger and it will stop on the break point:


We can now see the sorted output!
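If you don't feel like running it yourself, the rows come out in sku order - something like this (the exact formatting of a displayed binary-long depends on the runtime, so don't hold me to the leading zeros):

 +0000000001 =- beer
 +0000000002 =- wine
 +0000000012 =- fish
 +0000000999 =- chicken
 +0000001231 =- sauce
 +0000001234 =- beans
 +0000001254 =- bread
 +0000002342 =- dog food
 +0000004532 =- cat food
 +0000009999 =- tnt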


For more stuff like this - check out the Visual COBOL Knol and the Managed COBOL Knol community sites.

Wednesday, May 19, 2010

Why Is APM Important To COBOL?

Some of you may be asking "why is APM relevant?".  What has application portfolio management got to do with COBOL?  And how does it relate to everything else I've read on this blog?

Let me connect the dots.  (forgive me if I wander a bit *smile*)

To many folks, COBOL is "old code" that has got to go.  The reasons many provide usually have something to do with the user interface or the data, but not the core functionality these applications provide.  Those using the applications on a daily basis hate the old character interfaces.  This is thanks to the fact that everything else they use has a slick web interface and lets them generate their own ad hoc reports without waiting weeks or months for IT to get around to it.

To quote a CIO I spoke with recently "Users hate using the character screens but we couldn't do business without the functionality the systems provide".  He also had issues with the data.  Since everything was "locked" into the applications, they had to create a data warehouse which replicated the various information and made it more accessible to the business users.

He really, really, really would like to do something with the applications, but he didn't buy into the "let's rewrite it in (your language here)" pitch.  This is because he realizes those applications are very complex and rewriting them introduces unacceptable risk.  If he understood how these applications worked, he might be able to do something with them.

I saw another example of this today while visiting one of the largest insurance companies in America.  One team within the company was tasked with trimming $100 Million out of the company's operation budget.  To do so, they started asking questions about their mainframe-based COBOL applications.  What they found was that no one understood everything these applications did.  Nor did they know how interwoven the various applications were with the rest of their infrastructure.  Sure they knew pieces and parts, but no one had the big picture.

Just to make sure you didn't miss my point, no one in this multi-billion dollar company understands how their systems work or what they consist of.

And we are talking about a company with literally hundreds of architects, project managers, application developers, etc.  Even those who have been with the company 30 years only knew what parts of the various application elements did or were comprised of.  And these folks are making multi-million dollar decisions every day based on what they know about their systems!

Yikes!

I've digressed a bit, but hang in there, I can draw it all together. I think... *wink*

How would both of my examples solve their problems?  APM.

With an APM-type solution, that old code becomes a reusable building block for the new applications your company needs.  By using an APM solution to gain an understanding of the application and its composition, you could then take that "old COBOL code" and re-purpose it with a new user interface or by opening up the data layer. 

With an APM solution, companies can see how to reduce their IT expenses by identifying dead application elements, unnecessary application complexity, duplicate functionality and data.  APM-based solutions can provide management a real-time picture of these systems.  Who knows, they might even realize how risky a complete rewrite to (your language here) really is.

Recall the earlier posts on Visual COBOL? (see how I wove that in there?  slick huh?)

Once you have identified those pieces which are key to the business, you could use a tool like Micro Focus Visual COBOL to create a COBOL-based web service.  Then those who are building the new user interfaces could use whatever language they want to build that slick new user interface.  And by doing so, COBOL continues to have a place in the environment.

Or what about the post on migration?  (yes that was another post I made) With an APM solution, you can identify what pieces of an application can be moved off the mainframe to a Windows or UNIX or Linux server.

So, back to my point.  I believe that APM-based technologies are critical to the future of COBOL. 

They can give new life to those COBOL systems...

That's my thinking anyways.  What's yours?

Thursday, April 22, 2010

Application Portfolio Management - Best Practices

It's tough to control your application portfolio. Your systems have been developed over the course of years or even decades. The people who developed the apps may have moved on to other roles, and documentation is out of date. So the portfolio gets more and more complex. And that complexity means the applications are slower to adapt and more expensive to maintain.

Application portfolio management is an approach to help return control to application managers. In this series of posts, I'll take a look at best practices for managing the application portfolio.

  • Goal: Constant fire-fighting is no way to run a development organization. Especially in today's era of tight budgets and fast change. In this post I'll summarize the goal of APM and set the stage for a discussion of best practices. Read the post.

  • Questions and Metrics: APM data should answer questions that address a specific goal. Say, ‘why is this business process inflexible?’, ‘where can I cut costs?’, or ‘where is my software architecture flawed?’. To answer these different questions requires different combinations and weightings of data (user surveys, application code, or external sources). Sometimes more of one source, sometimes another. Read the post.
  • Decision-Makers: APM data needs vary based on where you are in the organization. Higher level managers require higher level abstractions, particularly of technical metrics. Also, different types of users will have different data needs. An architect may want technical complexity data, but it may only be meaningful to him if it is filtered by architectural models. Read the post.
  • Maturity: There are different levels of maturity for decision-making. This maturity directly affects which metrics are accessible in the first place and also indirectly because it determines the kind of business goals that an organization is prepared to address. Read the post.
  • What’s next: As a particular initiative moves from “decision” to “action”, different data may be needed. More “bottom-up” data may be necessary to implement the decisions at this stage. Further, different metrics can be monitored to ensure the success of a given development or modernization project as it is executed. Read the post.

Friday, April 16, 2010

Precision: What is it Precisely?

Alex has an interesting post over on Nerds Central which I believe you may find of value.

He details a couple of useful items specific to decimal precision in Cobol and Java.  It is interesting because it illustrates the use of a Cobol data type I've not played with: float-long.

Float-Long came about a few years ago, along with Float-Short and Float-Extended, and was made part of the 2002 standard, I believe.  To make a long story short, Float-Long isn't quite as accurate as you might hope, but it does have its uses.  Alex provides a complete example showing both how to achieve an exact arithmetic answer using a Comp data type and how to use the Float-Long data type.
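If you want to see the difference for yourself without digging into Alex's post, here's a quick experiment along the same lines (my own sketch, not his code): add 0.1 ten times and compare.

       $set sourceformat(variable)
       program-id. "precision-demo".
       working-storage section.
      *> fixed-point decimal: each add of 0.1 is exact
       01 exact-sum pic 9(2)v9(16) comp-3 value 0.
      *> float-long is an IEEE double: 0.1 has no exact binary form
       01 float-sum float-long value 0.
       procedure division.
           perform 10 times
               add 0.1 to exact-sum
               add 0.1 to float-sum
           end-perform
      *> exact-sum comes out exactly 1; float-sum will be a hair off
      *> (how the float displays depends on the runtime)
           display "comp-3 sum:     " exact-sum
           display "float-long sum: " float-sum
           goback.
       end program "precision-demo".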

The other reason I find the post of interest is that, in his article, it appears the Java code can only duplicate what the Cobol Float-Long data type provides (which is inexact), not the more precise answer he achieved using the traditional Cobol Comp data type.

Am I mistaken in my interpretation of the Java sample he provided?

This looks like a red flag to me if your Java code is doing any math which requires a high level of accuracy on the right side of the decimal.

Can someone verify this and post a Java-based example confirming/denying the problem?

I'm curious but only know enough Java to be able to nod in the right places when talking with a Java developer. *smile*

P.S.:  The Visual Studio 2010 Launch in Las Vegas was quite the event.  Thanks to various issues with my plane ride (rerouted due to a medical emergency, then detained in Albuquerque, NM due to mechanical issues, etc.), it only took me 20 hours or so to make the 3 1/2 hour trip home.  At least I wasn't flying to London!

Monday, April 12, 2010

Visual Cobol. You gotta see it

Hey folks!

I'm in lovely Las Vegas on the conference floor talking with customers about the new Visual Cobol release which is being announced at the Visual Studio 2010 Launch.

So far the feedback has been that this is really slick stuff indeed.  This is the first time I've had a chance to look at the new syntax support.  Micro Focus development has done a tremendous job simplifying the syntax.  If you haven't looked at the articles out on C# Corner, go take a quick look.

Oh well, gotta go.  Mike wants his laptop back!

Wednesday, March 3, 2010

JCL and Procs Running on a Windows Server Near You!

Hey folks,

Sorry to be slow in posting anything on mainframe migrations that I promised. 

Seven weeks ago I was grabbed by the neck and tossed into shark-infested waters on a pilot project for a large retailer.  They were looking into the idea of turning off their mainframe.  For some reason or another, they didn't just plop down their checkbook and write a check.  "Run our application on Windows?  Prove it," they said.

So after many fun days and nights of work, we demonstrated their application running off the mainframe.  Nothing like reinventing, in a handful of weeks, an environment which took twenty years to set up.

Looking back, we spent almost all of our time building out the plumbing.  Getting three CICS regions stood up to match the mainframe, building out their PDS, Loadlib and Joblib structures on the server, wiring in an ISC link back to the mainframe, setting up EZASockets, etc., took most of the time.  The application elements (BMS, JCL, PROCs, COBOL, etc.) came down with no changes.  And once we finished the wiring, those elements ran just fine after compilation and whatnot.

In the past, the approach would have been to convert the JCL to a VB script.  But nowadays JCL can be run pretty much as is.  The biggest caveats are around third-party tools such as FileAid, Easytrieve and so on; alternate approaches to those elements may have to be put together before the jobs can run.  But the core elements can run just as they are.

For instance, consider the sample JCL below:

//JCLTEST JOB 'JCL TEST',CLASS=B,MSGCLASS=A

//*
//* DELETE EXISTING DATASETS
//*
//GETRID EXEC PGM=IDCAMS
//SYSPRINT DD SYSOUT=*
//SYSIN DD *
DELETE MFIJCL.OUTFILE.DATA
SET MAXCC=0
//*
//* ALLOCATE AND WRITE RECORDS TO A DATASET FROM A USER PROGRAM
//*
//CREATE EXEC PGM=JCLCREAT
//OUTFILE DD DSN=MFIJCL.OUTFILE.DATA,DISP=(,CATLG),
// DCB=(LRECL=80,RECFM=FB,DSORG=PS),
// SPACE=(800,(10,10)),UNIT=SYSDA
//SYSOUT DD SYSOUT=A
//*

//* USE THE IEBGENER SYSTEM UTILITY TO COPY RECORDS INTO A TEMPORARY
//* DATASET AND PASS ON FOR USE BY SUBSEQUENT STEPS
//*
//GENER EXEC PGM=IEBGENER
//SYSPRINT DD SYSOUT=*
//SYSUT1 DD *
Rec 1
Rec 2
Rec 3
Rec 4
Rec 5
//SYSUT2 DD DSN=&&JTEMP,DISP=(,PASS),
// DCB=(LRECL=80,RECFM=FB,DSORG=PS),
// SPACE=(800,(10,10)),UNIT=SYSDA
//*
//* READ RECORDS FROM THE TEMPORARY DATASET CREATED IN THE PREVIOUS
//* STEP
//*
//READ EXEC PGM=JCLREAD
//INFILE DD DSN=&&JTEMP,DISP=(OLD,DELETE)
//SYSOUT DD SYSOUT=A
//

It runs just fine on a Windows or UNIX box using the Micro Focus tools.  And before you ask: yes, GDGs and restart logic work too.  My sample just didn't have any in it, and I didn't go looking for a better sample.  You'll have to use your imagination.
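And in case you're wondering what a program like JCLREAD looks like: nothing fancy.  Something along these lines would do the job (my sketch here, not the actual program from the project); the INFILE in the select maps to the //INFILE DD statement in the JCL:

       program-id. JCLREAD.
       environment division.
       input-output section.
       file-control.
           select in-file assign to INFILE
               organization is sequential.
       data division.
       file section.
       fd in-file.
       01 in-rec pic x(80).
       working-storage section.
       01 eof-flag pic x value "N".
           88 at-eof value "Y".
       procedure division.
           open input in-file
           perform until at-eof
               read in-file
                   at end set at-eof to true
                   not at end display in-rec
               end-read
           end-perform
           close in-file
           goback.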

The hardest part of the project involved getting the customer's CICS user exits brought over and working.  I had to wire ISC links back up to the mainframe CICS region to execute routines not included in the downloaded project components.  But I digress... *smile*

I was talking about JCL and Procs.  I have yet to find JCL that runs on z/OS which won't work...  The things I have that don't work are the utilities or third-party tools, not bone-stock JCL.  Does anyone have a specific question on migrating JCL that they are dying to ask?

Saturday, February 27, 2010

Are you 'Visual'?

If you're like a lot of people, you learn more by observing how a process or task is completed than by having it explained or by reading a book. While manuals and instructions are important, hands-on experience reinforces the training being received and really 'drives it home'. I am one of the visual types myself. I like to work with a new process or product, dig into it, see what it's made of, try to push it to its limits and see what happens. This past week I had a wonderful time in Newbury, England with the Micro Focus development teams kicking the tires on Visual COBOL.


"Visual COBOL"?!?!? Yep...Visual COBOL. Now this isn't some marketing campaign kicked off by the Micro Focus Execs to try and sell more re-branded, same old run of the mill compilers (trust me, this ain't that at all). This is a whole new product that was developed to take advantage of the Microsoft .NET environment and platform. Before I explain about Visual COBOL I need to fill in some details. On or about April 12th Microsoft is going to be releasing Visual Studio 2010. This is going to be a major release for Microsoft, along with several other products. There will also be a new version of the .NET Framework being released. As part of the Global Agreement between Microsoft and Micro Focus, Micro Focus is required to release a product that will take advantage of the new architecture and features of the Visual Studio 2010 product... and do it within 30 days of the VS2010 launch date. Enter Visual COBOL.


Visual COBOL is 'not the same old COBOL' from Micro Focus. Not even close. Oh, there are similarities: a 30-plus year history of COBOL development experience, a lineage that has spanned multiple platforms, multiple language revisions and dialects, and more developers banging away on it than most other language development environments. But that's where the similarity ends. Visual COBOL is a whole new development product. The folks in Newbury took the knowledge and experience they have, married it with the excellent technology of .NET and created a whole new compiler and runtime environment. As some people have said in the past about COBOL's growth, "this ain't your Daddy's COBOL", and they are so right. Now don't assume that just because it's a "whole new development product" your old COBOL code won't work on it... wrong answer, Skippy! Although the development product is brand new, Micro Focus has taken great care to ensure backward compatibility. To that end, back to this week!


The week in Newbury was designed to bring a group of people from outside Newbury together with the Newbury and Sofia development teams to really kick the tires on Visual COBOL. The goal was to see where the product was in relation to the project timeline, what issues were noted, and what could be resolved before the go-live date. Everyone brought along samples of code, and the first thing everyone did was try out their code on the compiler. I can attest that 100% of my previous code compiled and executed. The team from Japan was also very impressed that not only did their code execute, but the interface was accurate, with very few issues noted. The team from France had similar results. The major portion of the week, though, was devoted to a process called 'STX', or Super Test eXtended. STX was a three-day programming marathon session for everyone involved. All development on Visual COBOL was halted. The developers were split into teams of two, and each was to come up with a project using Visual COBOL. They were supposed to use Visual COBOL as you or I would, as end-users. All coding was to be done in COBOL, only COBOL. This was the first time I had participated in an exercise such as this. To put it mildly, I felt like a fish out of water. While I like to think I'm a good coder, I felt like a kid out of elementary school around these people. The way their minds work and the things they came up with were amazing. And their ability to go from concept to prototype in a very short period of time was awesome. The end result... WOW!


Friday was our presentation day, the day we regrouped, showed off what we had come up with and described our experiences using Visual COBOL. Now, I need to mention that there were developers involved who were new to Micro Focus, used to coding in Java or C++, in Eclipse, on Unix/Linux. The bottom line, though, was that they had to use Visual COBOL running under Visual Studio 2010. The end results were pretty amazing. (As an editorial note, Micro Focus has some awesome developers. The stuff these guys not only came up with but demonstrated was amazing. Back to the article.)
One of the Development Directors (hey, he was a coder at one time before he became a 'Director') came up with a copyfile parsing routine. By supplying a copybook the routine would identify different segments of the copybook, identifying keys, redefines, etc. and apply different colorizations to the fields depending on usage. Here's a screen shot.
Another example was the snake game. Using the Micro Focus ADIS controls (going back to the bit about backward compatibility here) a snake game was created. A snake game is where you have to move around the screen eating letters or numbers but not over-step your tail as the snake grows in length as you eat the letters off the screen. The Team in Sophia, Bulgaria created an XML parser/generator that could read in a file and generate XML that could be used in a cloud-computing environment. Another person created a WPF tic-tac-toe game. One team of 2 even went so far as to search the internet for an interesting C# sample to see if they could take what was done in C# and convert it to Visual COBOL. (I found this example as well but am nowhere near as savvy as these guys are so didn't even attempt it). Have you ever played Visual COBOL Tetris? This version looked awesome. Great colors, quick response time, smooth graphics. All built using Visual COBOL and taking advantage of the .NET Framework 4.0. I'm hoping this example makes it into the release. It was really cool and to be honest, looked better than the C# version (sorry Microsoft). There were other examples but these were the ones presented to the excited crowd.
Did you notice one thing about the examples described? All but one were games! That's right... games. COBOL is no longer relegated to crunching business information and generating reports. While it is still very well suited to that task, it has come so much farther than that, thanks to companies like Microsoft and Micro Focus. The agreement between these two companies is translating into a lot of opportunity for companies with existing COBOL resources to become more profitable with what they have. By taking their existing COBOL source code and development teams, and spending some time and money on retraining, customers will be able to realize some huge technological advances for what turns out to be a small investment. Visual COBOL and Visual Studio 2010 will greatly enable customers to leap forward technically and afford them the ability to please their customers.
I'm really excited to have been a part of this effort and am very much looking forward to the release of Visual Studio 2010 and Visual COBOL. As soon as I get the OK, I'll start describing the changes that are coming, not only in presentation but in the compiler. Trust me, it will make our COBOL code even easier to read and follow for non-COBOL programmers, who can then learn how to code COBOL and REALLY become productive!
Happy Coding!






Friday, February 12, 2010

A Customer Visit

I'd like to share with you a visit I had last week with a decent-sized insurance carrier. This company is looking to migrate their current mainframe environment to Windows servers. Management at the company has looked at the costs, the benefits, the pitfalls and the time it will take, and decided it makes the most sense for the company to do this to remain competitive as well as profitable. Management looked at all the numbers and laid down an edict to be off the mainframes by a specified date. Nothing new so far, right? I mean, this is happening more and more. One would think, imagine, even expect the people "in the trenches" to be ready to mutiny! No one really asked their opinion or advice? No one asked if it was technically possible? No one asked them anything. But you know what? It turns out they didn't have to, and to quote Paul Harvey... "and now... the rest of the story".

Insurance companies have taken quite a financial hit, not only lately but over the past decade or more. Costs are skyrocketing, resources are shrinking, and looking for places to cut costs has become the mantra of today's insurance companies. IT seems to take a hit every time. In a lot of companies, costs are slashed, staff eliminated, and those left behind have to deal with the fallout. This company was similar; they've had recent staff reductions. But the remaining staff saw an opportunity. They realized the ways of doing business were changing and they needed to look at new and innovative ways to save their company. Yeah, even though they lost friends during the last round of layoffs, they still considered the company a great place to work and wanted to help as much as possible. So that's the background.

In most companies I've been to, the first (or nearly the first) sentence out of anyone's mouth is "It'll never run off the mainframe" or "You can't run that on a PC". You know the first words I heard from these people? "Cool, what do you need to make it happen?" Attitude. They had a positive attitude and were looking for ways to not only strengthen what they currently had, but expand what they were doing to make it all work better together. Better integration. Better performance. Better code analysis. Better production runs. They saw this as an opportunity to better not only themselves, but their company.

I can't relate how refreshing it was to see this attitude. A lot of people would've been pessimistic. They were optimistic. During the engagement we exchanged ideas, did a lot of 'what if-ing' and worked out solutions. I'm not saying all the other accounts I've visited have been depressing or unhelpful, but there was an electricity at this account. Apprehensive? Yes, but optimistic as well. It's all about the attitude we convey to others.

Now you may ask, "What has this got to do with anything?" Really not a lot, and really a whole lot. On the surface, it's a feel-good article about someone and someplace most of us will never go. Deep down, though, it's about the attitude we bring to our daily jobs and the ability to see outside our defined box and the desire to expand it and ourselves. Look outside your box; look for opportunities to grow and expand yourself and your capabilities. You may be surprised at what you can do!

Sunday, February 7, 2010

To Emulate or Not to Emulate - That is the Question

Some great comments on the last posting.  Thanks everyone for your input!

(emulation at its best)

As you may have guessed from the title, I'm going to see if I can tackle the emulation comment.  I've never really thought of the Micro Focus Mainframe Subsystem (MSS) as an emulation environment.  But then I went out to Dictionary.com.  Hmmm... seems like maybe I'm wrong to some degree, if you follow that definition.  One part of the definition fits nicely: "When one system performs in exactly the same way as another...", but the rest pointed out an opinion I've always associated with emulation: "...though perhaps not at the same speed."

My thinking has been that an emulation environment involves a "wrapper" around the application, filtering the API calls and executing lower-level calls to OS-specific functions to perform comparable tasks.  This means extra steps, which equates to slower performance.

After additional pondering (no Rick, I didn't fall asleep!), the key component which slows things down is that wrapper layer.  However, from my use of the Micro Focus tools, I don't see such a wrapper, or the penalty it would carry.  What I see are direct calls to platform-specific APIs which provide the same functionality as on the mainframe.  The difference is that those routines are OS-specific versions, executing natively.

Maybe I'm nuts, but to me this seems a bit different.  What you end up with is something which doesn't necessarily pay the performance price you would see with that "wrapper layer" or shell: A still calls B, not A calls B which calls C.

The MSS emulates the functionality found in the z/OS environment, but I don't believe it acts as a shell environment within which the application runs.  Statements such as EXEC CICS and READ / WRITE call APIs which are compiled to run natively on the platform, whether UNIX, Linux or Windows.  For instance, a READ statement is passed to the Micro Focus file handler, which executes the appropriate lower-level function to access the indexed file on that platform.  And if you have the need, you can even replace the file handler with your own.  The MSS is Micro Focus' own engine, providing the same functionality found on z/OS for things like JES, CICS or IMS, but on different operating systems.

To give you some perspective, I recently had the opportunity (during the snow storm in New York City in December - thanks, Rob! *grin*) to do a performance test for a prospective customer.  The prospect gave us a z/OS batch routine (JCL, COBOL, VSAM) which processed just under eleven GB of EBCDIC data (one input file).  On their z10 z/OS mainframe, this job with this data ran in two minutes and thirty seconds of CPU time and created fifty-six separate GDG output files.

After setting up the tools in a Windows Server 2008 R2 virtual machine managed by Hyper-V on a dual-CPU quad-core HP blade with a high-speed SAN, we were able to recompile the program and run the job with no changes to either the program or the data file (we kept it EBCDIC).  Over five runs, we averaged 1 minute and 20 seconds.  Is this typical?  Nah.  But is it unusual?  Nah.

If this application execution environment were wrapped in an emulation layer, I just can't see how it could be faster.  It would have more to do to perform the same work and would therefore be slower.

So, yes, it emulates the functionality of the environment, but it doesn't do it by running in an emulation environment.  Almost a Yogi statement there, huh?

There are a few folks who follow this blog who might be able to add their two cents to this.  Maybe they will post some thoughts.

Anyone?

Sunday, January 31, 2010

Saving the Baby

I know why companies are abandoning COBOL.

"Why is this Robert?", you inquire. Well, have a seat there and let me e'splain.

As many already know, the most expensive item in the data center is usually the mainframe and everything that goes into keeping it alive and well, processing the millions of rows of data and transactions each week. If asked what sound best exemplifies their data center, the CIO probably wouldn't pick the sound of a Ferrari, or the sound of an ever-reliable diesel engine. Nope. Not even close.

It would most likely be the sound of a vacuum cleaner going full tilt trying to inhale that sock someone left under the edge of the couch. As far as the CFO is concerned, these environments suck the money right out of the company coffers. Keeping the big iron up and running requires some serious funds.

These “enterprise platforms” allow those same companies to realize the profits the shareholders care so much about. Without these enterprise class systems created around the mainframe platform, there would be no company. A catch-22 it would seem.

The problem is that many people confuse the platform with the systems which run on it. The largest portion of these applications is written in COBOL. So, mainframe = bad and COBOL = mainframe. Then many conflate the two and come to the conclusion that COBOL = bad.

A common theme I've heard over the years is to replace the platform AND the applications. This is how folks like SAP acquired such a toe-hold in many corporations. They sold CIO's on the idea that to reduce costs in the long term you had to do both. Bang, that COBOL shop goes the way of the dodo.

But I have yet to hear about a single company who implemented SAP or any other package on time and under budget. How these packages keep getting sold as replacements to the tried and true COBOL applications is beyond me.

Others have been pitching that they can rewrite these massive processing engines in another language on a lower cost platform quickly. And the initial prototypes seem to indicate this can indeed be done quickly and at a low cost. An uninformed decision is made and another COBOL shop bites the dust.

On a rewrite, what usually ends up happening is that the first prototype works well, but when they start trying to build a complete system, reality sets in. Many find it is very hard to recreate something in the projected couple of months which took 20 years to write. Going with a package or rewriting everything costs more than most folks realize. But these aren't the only options and COBOL doesn't have to necessarily die to help cut costs.

Companies can swap out the platform without swapping out the applications. The COBOL, PL/1, CICS, DB2, IMS, JCL applications can run just fine on platforms such as Windows Server 2008 R2. No need to throw out the baby with the bath water! 20 years ago, you had little choice. Nowadays, companies have options.

You've seen some of the posts by Alex, Rick, and myself about the “new” capabilities of the environment. In my next few posts, I'll discuss some of the features which allow you to move your existing mainframe COBOL applications off without rewriting them. Let me know if there is something particular you want me to focus on. Stay tuned!

Monday, January 18, 2010

Love COBOL - Then Shout About It

People don't search the web about COBOL much - why? Because few people publish anything about COBOL - let's fix this!

Over the last two months I have been busy posting a few articles on COBOL to 'The Code Project':



The Code Project is a Windows-centric community code site. The interesting thing is that (remember, these have only been up for a few weeks) these posts have attracted over 7000 (yes - seven thousand) views!

If you love COBOL, write posts. People are interested. Post them on blogs, post them on code sharing sites. Even send them to me and I will post them (for details - just comment on this post).