Pages

Monday, 5 November 2012

COBOL: A survivor in the battle of the best


Phoenician is a dead language. Mayan is a dead language. Latin is a dead language. What makes these languages dead is the fact that no one speaks them anymore. COBOL is NOT a dead language, and despite pontifications that come down to us from the ivory towers of academia, it isn’t even on life support. What made those other languages die is the fact that they became obsolete. As the peoples that spoke them were overrun or superseded by other populations that eventually replaced them, no one saw any need to speak their languages. There was no good reason to keep on speaking a language whose creators had become irrelevant to history.


COBOL is different. Certainly, there were more people who “spoke” COBOL back in the 1980s than there are now. Remember, however, the second word in COBOL’s acronym – business. Businesses are complex social and economic organisms that exist for but a single purpose – to make money. One of the approaches businesses take to satisfy that all-important survival trait is to avoid expenses. This avoidance of expense turns out to have been key to the survival of COBOL, because those programmers of the 1980s (give or take a decade) were very busy programmers. Estimates are that as many as several hundred billion lines of COBOL code were written for businesses world-wide. Because of the first word in COBOL’s name (“Common”), as businesses replaced their older, slower and less-reliable computer systems with newer, faster and more-reliable ones, they found that the massive investment they had in their COBOL software inventory paid dividends by remaining functional on those new systems - many times with no changes needed whatsoever!

Unwilling to endorse change merely for the sake of change, businesses replaced these billions and billions of lines of COBOL code only when absolutely necessary and only when financially justifiable. That justification arrived as the 20th century neared its end.

Written long before the century’s end was in sight, many of those COBOL applications used 2-digit years instead of 4-digit years because, when the programs were written, computer storage of any kind was expensive. Why waste millions and millions of bytes of storage on all those “19” sequences when the software could simply assume them? Since their software would suddenly think the current year was “1900” after the stroke of midnight, December 31st 1999, businesses knew they were going to have to do something about the “Y2K” (programmer “geek speak” for “Year 2000”) problem.

At last! Y2K was going to be the massive asteroid strike that finally killed off the COBOL dinosaur.
Unfortunately for those seeking the extinction of COBOL, that proved to be wishful thinking. Always concerned with the bottom line, businesses actually analyzed the problems with their programs. Many applications were replaced with newer and “better” versions that used more appropriate (translation: more politically correct) languages and computer systems. BUT … many applications were not replaced. These were the absolutely essential applications whose replacement would cripple the business if everything didn’t go absolutely perfectly. These COBOL applications were modified to use 4-digit years instead of 2-digit ones. At the same time, many of them received cosmetic “face lifts” to make their computer/human interfaces more acceptable, frequently with the help of modules developed in the newer languages.
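The two-digit trap, and the “windowing” repair that many shops applied instead of a full rewrite, can be sketched in a few lines. (Python here purely for illustration; a real shop would have patched the COBOL itself. The pivot year of 50 is an assumption; each site chose its own.)

```python
def naive_expand(yy):
    # The original assumption: every two-digit year belongs to the 1900s.
    # At the stroke of midnight on December 31st 1999, "00" becomes 1900.
    return 1900 + yy

def windowed_expand(yy, pivot=50):
    # The common "sliding window" remediation: two-digit years below the
    # pivot are treated as 20xx, the rest as 19xx.
    return 2000 + yy if yy < pivot else 1900 + yy

print(naive_expand(0))       # 1900 -- the Y2K bug
print(windowed_expand(0))    # 2000
print(windowed_expand(85))   # 1985
```

Windowing was cheaper than widening every date field to four digits, which is one reason so many of those essential applications survived intact.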


The result is that even today, after the Y2K “extinction event”, there are, by industry estimates, over 40 billion lines of COBOL code still running the businesses of the 21st century. Disturbing to some is that, just as tiny furry mammals evolved to cope with the original “extinction event” holocaust, COBOL has also evolved into a leaner and meaner “animal” capable of competing in niches and providing services unthought-of back in 1968. Indeed, industry analysts who track those lines of COBOL code report that they are actually growing at a rate of about 4 billion a year.



Evolution, you see, is in COBOL’s DNA. Over time, COBOL evolved in form and function, first via work done by the American National Standards Institute (ANSI) and eventually through the efforts of the International Organization for Standardization (ISO).



The first widely-adopted standard for COBOL was published by ANSI in 1968. Named the ANS68 standard, this version of COBOL was originally standardized for use primarily as the business programming tool of the US Defense Department; it was quickly adopted by other government agencies and private businesses alike.

Subsequent standards published in 1974 and 1985 (ANS74 and ANS85, respectively) added new features and evolved the language toward adoption of the programmer-productivity tool of the time – “Structured Programming”.

As the 21st century dawned, programming had moved out of the board room and into the Game Room, the Living Room and even the Kitchen; as computers became more and more inexpensive they appeared in games, entertainment devices and appliances. Even the automobile became home to computers galore. These computers need software, and that software is written in the so-called “modern” languages.

Combined with Y2K, these trends became the impetus for COBOL to evolve even newer features and capabilities. The COBOL2002 standard introduced object-oriented features and syntax that make the language more programmer-friendly to those trained by today’s programming curricula. The COBOL20xx standard, currently under development, carries the evolution forward to the point where a COBOL20xx implementation will be fully as “modern” as any other programming language.


Through all this evolution, however, care was taken with each new standard to protect the investment businesses (or anyone, for that matter) had in COBOL software. Generally, a new COBOL standard – once implemented and adopted by a business - required minimal, if any, changes to upgrade existing applications. When changes were necessary, those changes could frequently be made using tools that mechanically upgraded entire libraries of source code with little or no need for human intervention.

The OpenCOBOL implementation of the COBOL language supports virtually the entire ANS85 standard as well as some significant features of the COBOL2002 standard, although the truly object-oriented features are not there (yet).

Sunday, 4 November 2012

You know a programming language? I know, that's not COBOL!




       If you already know a programming language, and that language isn’t COBOL, chances are that language is Java, C or C++. You will find COBOL a much different programming language than those – sometimes those differences are a good thing and sometimes they aren’t. The thing to remember about COBOL is this – it was designed to solve business problems. 



It was designed to do that in the 1950s. COBOL was the first programming language to become standardized such that a COBOL program written on computer “A” made by company “X” could be compiled and executed on computer “B” made by company “Y”. This may not seem like such a big deal today, but it was a radical departure from all programming languages that came before it and even many that came after it. The name “COBOL” actually says it all – COBOL is an acronym that stands for “COmmon Business Oriented Language”. Note the fact that the word “common” comes before all others. The word “business” is a close second. Therein lies the key to COBOL’s success.

A technology hidden from the common man




Despite the predominance of mainframes in the business world, these machines are largely invisible to the general public, the academic community, and indeed many experienced IT professionals. Instead, other forms of computing attract more attention, at least in terms of visibility and public awareness. That this is so is perhaps not surprising. After all, who among us needs direct access to a mainframe? And, if we did, where would we find one to access? The truth, however, is that we are all mainframe users, whether we realize it or not (more on this later).





Most of us with some personal computer (PC) literacy and sufficient funds can purchase a notebook computer and quickly put it to good use by running software, browsing websites, and perhaps even writing papers for college professors to grade. With somewhat greater effort and technical prowess, we can delve more deeply into the various facilities of a typical Intel®-based workstation and learn its capabilities through direct, hands-on experience, with or without help from any of a multitude of readily available information sources in print or on the web.




Mainframes, however, tend to be hidden from the public eye. They do their jobs dependably (indeed, with almost total reliability) and are highly resistant to most forms of insidious abuse that afflict PCs, such as email-borne viruses and trojan horses. By performing stably, quietly, and with negligible downtime, mainframes are the example by which all other computers are judged. But at the same time, this lack of attention tends to allow them to fade into the background. Furthermore, in a typical customer installation, the mainframe shares space with many other hardware devices: external storage devices, hardware network routers, channel controllers, and automated tape library “robots,” to name a few. The mainframe is physically no larger than many of these devices and generally does not stand out from the crowd of peripheral devices. There are different classes of mainframe to meet the diverse needs of customers, and a mainframe can grow in capacity as the business grows.

So, how can we explore the mainframe’s capabilities in the real world? How can we learn to interact with the mainframe, learn its capabilities, and understand its importance to the business world? Major corporations are eager to hire new mainframe professionals, but there is a catch: some previous experience would help.

Saturday, 3 November 2012

What is special about mainframes?


MVS (Multiple Virtual Storage) is the primary operating system on the IBM 370 series of mainframes. (You may hear people use various initials and four-digit numbers when referring to IBM mainframes, such as 3033, 3090 or ES9000, but they are all considered hardware models of the 370 series.) MVS is the eighteen-wheeler of operating systems. People don't use it for flash and speed; they use it to bear large, heavy loads steadily and dependably.

When we talk about the tremendous processing power of a mainframe running MVS, we're talking about a power different from that of supercomputers. Supercomputers do complicated calculations at very high speeds. Designing them for the best possible calculation speed often means sacrificing I/O (input/output) speed; the scientists who use them are more likely to give them a complex math problem and say "grind away at this equation all night" than they are to say "read in these 300,000 records of data, do 8 calculations on each, and then output 300,000 separate reports." Reading and writing a tremendous amount of data and doing relatively simple calculations with it (for example, calculating interest payments, as opposed to calculating a boat hull's optimum shape) is the province of mainframes running MVS. An insurance company keeping track of its accounts, a chain of stores keeping track of its inventory, or any large company keeping track of its employees and payroll would use MVS. Because it's a multi-user operating system, MVS lets many different users use the same programs and data at once.
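That workload, reading many records and applying a few simple calculations to each, can be sketched as follows. (Python here purely for illustration; the account data and the 3% rate are made-up figures, and a real MVS shop would run this as a COBOL batch job.)

```python
def apply_interest(balance, rate=0.03):
    # One year of simple interest on an account balance.
    # The 3% rate is an assumed figure for this sketch.
    return round(balance * rate, 2)

# A toy version of the classic batch job: read each account record,
# perform a small calculation, and write one result line per record.
accounts = [("0001", 1000.00), ("0002", 2500.00), ("0003", 500.00)]
for acct, bal in accounts:
    print(f"{acct}: interest {apply_interest(bal):.2f}")
```

The arithmetic is trivial; the point is the shape of the job, since at mainframe scale the list would hold hundreds of thousands of records, and moving them in and out quickly is what the hardware is built for.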

Personal computer users like to make fun of big computers running MVS, calling them "dinosaurs." While the interface may seem primitive, MVS has had many features since its introduction in 1974 that people are only now trying to shoehorn into the operating systems that control personal computer networks. MVS includes built-in recovery routines for dealing with faulty hardware like tape drives or even (in a multi-processor environment) faulty processors. A system running MVS can support thousands of users at once. The security of one user's data against tampering by others is an integral part of the system, designed into it from the ground up. (How often do you hear of a virus or a worm breaking into an MVS system?) The primitive interface isn't the only thing that gives people the wrong idea about MVS. A given MVS installation is highly customizable, and so is the way that each user uses it. Many different parameters can be set when doing virtually anything, and MVS doesn't always have the default settings that we take for granted on other systems. The most efficient settings are left to individual system administrators to figure out. Since many settings and details are site-specific, a new user on a particular system, no matter how much MVS experience he or she brings, can't be expected to know the best way to approach that system.

Thursday, 1 November 2012

Hi all

I am back to blogging after 3 years. Let's see what happens with me... I always start blogs so that they enhance my tech skills, and so that I can express myself technically in the field in which I am working.