One of my first jobs was working on decommissioning mainframe applications for the space program. While much of the code (Adabas/Natural) was written in the early '80s, a lot of the people involved were still working when I was there, so it was common to ask questions of the original authors when looking at reports or batch processes. One time, I asked my boss if we could set up a meeting with a specific person, and he asked me to look out the window and said "Do you see that big tree out there? That's their memorial tree."
I wrote COBOL programs 45-50 years ago. The "dot" at the end of a line is a period. It was a well-known error we called a "pregnant program": one missing the period.
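For anyone who never saw it, here's a minimal sketch of the sentence-terminating period in fixed-format COBOL (trivial program invented for illustration):

```cobol
       IDENTIFICATION DIVISION.
       PROGRAM-ID. DEMO.
       PROCEDURE DIVISION.
           DISPLAY "FIRST SENTENCE".
           DISPLAY "SECOND SENTENCE".
      *    Drop the period after the first DISPLAY and both statements
      *    collapse into one sentence -- harmless here, but inside an
      *    IF it silently changes which statements the IF controls.
           STOP RUN.
```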
It's called a full stop in Australia! We had a discussion in one of my programming classes on what programming syntaxes are called in the US and the rest of the world and there are quite a few differences.
Have you ever heard the Y2K joke? A COBOL programmer, tired of all the extra work and chaos caused by the impending Y2K bug, decides to have himself cryogenically frozen for a year so he can skip all of it. He gets himself frozen, and eventually is woken up when several scientists open his cryo-pod. "Did I sleep through Y2K? Is it the year 2000?", he asks. The scientists nervously look at each other. Finally, one of them says "Actually, it's the year 9999. We hear you know COBOL."
Dude, this stuff is so legacy, the entire first TWO generations that maintained it are dead. The 3rd generation that picked it up has either retired or IS retiring in the next couple of years. I had my shot at going into COBOL early in my career, and NOPED out. It was old/creaky and nobody wanted to F with it back then, and that was 35 years ago... What was a little SQL and a code snippet in FoxPro v2 to generate a report was PAGES of code in COBOL.
COBOL programmer positions should be passed from parents to children. They should have COBOL as an integrated part of their family name too, like a religious Order. And the linked article is flagged as spam. WTH
Mainframe COBOL programmer / architect signing in. I could share stories you people would not believe. Mainframe is one of a kind. I've seen systems up and running (uptime) for 14+ years. I've seen code running in production that was developed in 1982. Mainframe is the powerhouse of the world without you knowing about it. AMEX, VISA, Mastercard: anytime you swipe your card to pay, it goes thru a mainframe.
Mainframe support guy here. COBOL is robust and dependable and self-documenting. Why are legacy programs and systems around for decades? Because they did it right the first time.
@@thomaslink2685 "...they did it right the first time"? Let's not get carried away - I'm pretty sure programming back then was just as iterative as it is today. FWIW - I work at a bank and sit next to one of the guys that wrote their mainframe assembly application. He's told me a few stories. #respect
Same here, the oldest program I've worked on was from 1993, meaning the code is older than me. And it's not just banks that run this stuff; every company using SAP is running into the same problem now, because SAP runs on a language similar to COBOL.
I've seen things you people wouldn't believe. Batch jobs crashing into the void of unallocated memory. Hex dumps glittering in the green light of a sysadmin's console, waiting for code review. All those moments will be lost in time, like bits in registers. Time to log off.
For everyone in that chat commenting to "just rewrite it": they obviously have never seen that in the real world for an enterprise-scale software suite. About 7-8 years ago a software company I was working for decided to entirely rewrite a popular product in a "modern" language, Java, rather than maintain it in C. This product handled automated file transfers. Pretty simple, right? The old code was full of branches, corner cases, and "weird" one-offs. The company fired the old programmer and gave the rewrite to a team of younger engineers. The release of the new version was a disaster. Each of those baffling little branches of the code was there to handle some non-conformant condition in integrated applications the customers were using. Banks handling thousands upon thousands of complicated transactions a second just cannot accept code that might melt down in some rare but persistent corner case. Downtime can literally cost a big enough company millions of dollars a minute.
This "just rewrite it" would end up like the Windows 10 to 11 transition has: losing a lot of existing functionality, and instead of reimplementing it, some ad functionality gets priority. Big enterprise-scale software is comparable to an OS codebase in size and complexity.
It can be done with the right people... but it will take very long and cost you a fortune. So this endeavor mostly only gets funded when:
a) there is no other way (e.g. you fucked up before and now it's too late) --> fail
b) you underestimate the time and effort needed by orders of magnitude --> fail
c) you overestimate your abilities --> fail
d) you rightly estimate it all, and you have the funds, the people and the time --> might work
I have heard of some very rare cases where a rewrite did work. They had everything on their side and barely made it.
The best way can be the strangler pattern: don't rewrite the whole thing in one go, just put it behind a proxy that slowly becomes the new implementation, where you can do the most important and value-adding bits first.
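The idea can be sketched in a few lines; all names and routes here are invented for illustration:

```python
# Strangler pattern sketch: a thin proxy routes each request either to
# the legacy system or to the new implementation, so endpoints can be
# migrated one at a time while everything stays in production.

def legacy_handler(request):
    # Stand-in for a call into the old system.
    return f"legacy: {request}"

def new_handler(request):
    # Stand-in for the reimplemented service.
    return f"new: {request}"

# Endpoints already migrated to the new implementation.
MIGRATED = {"/balance"}

def proxy(path, request):
    handler = new_handler if path in MIGRATED else legacy_handler
    return handler(request)

print(proxy("/balance", "acct 42"))   # served by the new code
print(proxy("/transfer", "acct 42"))  # still served by the legacy code
```

As each endpoint is rewritten and verified, it moves into the migrated set, until the legacy system behind the proxy has nothing left to do.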
I worked at a bank and they were trying to move all their COBOL to Java. They did a few years of it, canned a bunch of COBOL devs. Then figured out that they couldn't even get close to the speed of COBOL in Java. They tried to hire back a bunch of those COBOL devs and started giving new hires out of college extra $$ to learn COBOL.
Not today, but this sounds like an amazing use case for Zig/Rust, where rewriting the codebase in Zig would be fast as in fewer human hours and blazingly fast as in CPU cycles. Edit: typo
Don't write a banking system in Zig; maybe Rust, but not Zig. It's pre-1.0 and I don't know what they're going to do about stability once they reach 1.0.
I learned about COBOL and the mainframe about 5 years into my first job as a Java/.NET/C developer. It's actually a sad story about the mainframe. The platform is fantastic; the Power8 and Power9 processors on the mainframe could run circles around the x86s running Linux, and the mainframe was doing things like LPARs, "containers" (though not called that) and VMs long before they were a thing on other systems. The biggest problems were IBM's "walled garden" (more like "Supermax Prison Garden") for third-party software, and most importantly, the absolutely atrocious interface. ISPF, MVS, TSO, and green-screen 80-character crap was the boat anchor that sank the mainframe. If people understood what the mainframe could do, and IBM put a decent interface and decent automation into it without crazy menus and documentation you had to pay for, then developers would have begged to develop on the mainframe. It had all the functionality of the cloud years ago, it's just hidden behind stuff meant to emulate 1950s punch cards for backwards compatibility.
No, the biggest problem was total cost of ownership. It doesn't matter if one IBM mainframe was 2, 3, 4 times as productive as that x86 Linux system when you could run 20 Linux systems for significantly less.
I work in insurance, similar story there. Lots of COBOL, text-based applications from the 70's, and the like. About 15 years ago we stopped using actual punch cards, and mainframe terminals are now emulated. Supposedly there is an old timer who lives in Fiji who we fly in on short notice any time something REALLY important breaks. Zero political will to rip it all out and rebuild. Trust me, people have tried.
It costs money and you cannot really market it to people. Customers don't really care if you use COBOL or Rust. From managers' point of view it's a system that works; migration costs money, and it's way better to rely on that one guy in Fiji in case something breaks. And if the manager in question is evaluated based on saving money, this is a clear "no" to any migration because it could affect the yearly bonus. And yes, the one-time investment into migration could save money on maintenance in the future, but managers don't think that far ahead because they might not even be with the company at that time.
@@ThatAnnoyingGuyOnTheInternet Yeah, but at some point the net present value of maintaining an old system is greater than the net present value of replacing it, or is outweighed by other business considerations and risks, like no one understanding key operational systems. But that is clearly a long-term concern.
@@JGComments I have never seen mainframe migration which, at end, didn't cost more than original mainframe. They are priced accordingly to their tasks. Enterprise Oracle + Openshift + NAS still have both licence and running costs well into mainframe teritory. And with modern stacks you usually need more IT support people. That doesn't meen there is no value in migration. Latest migration I have seen was done purely because of future staffing.
@@bariole I agree that it would cost more. The primary issue is business risk of losing the ability to fix or maintain it, because for key systems you are putting large streams of revenue at risk which are much larger than the cost to rebuild, on a net present value basis.
My second job was at a bank; it was mainly COBOL, JCL, and Assembly for the first 1.5 years. Coming from Java it took me quite a while to get the hang of things. Almost all of our data was stored in flat files, so in order to read those you had to go look at the code to make sure you had the right header sizes, etc., otherwise it was just binary garbage. I remember thinking how dumb this was at the time and why not use a DB, but now, having created custom binary formats for stuff, it makes so much more sense. Also assembly code reviews: those guys printed out what looked like dictionary-sized books of a single assembly module and spent hours reviewing it, still crazy to me.
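The "binary garbage without the layout" point is easy to demonstrate in any language. A minimal Python sketch, with a record layout invented for illustration (the real layouts lived in COBOL copybooks):

```python
# Reading a flat file of fixed-size binary records: without knowing the
# record layout, the bytes are meaningless. This invented 14-byte layout
# is a 10-byte account id plus a 4-byte big-endian integer of cents.
import struct

RECORD = struct.Struct(">10si")

def write_records(path, records):
    with open(path, "wb") as f:
        for acct, cents in records:
            f.write(RECORD.pack(acct.encode().ljust(10), cents))

def read_records(path):
    with open(path, "rb") as f:
        data = f.read()
    # RECORD.size and the field offsets are the "header sizes" you had
    # to dig out of the code before the file made any sense.
    for off in range(0, len(data), RECORD.size):
        acct, cents = RECORD.unpack_from(data, off)
        yield acct.decode().rstrip(), cents

write_records("accts.dat", [("ACCT001", 125000), ("ACCT002", -3050)])
print(list(read_records("accts.dat")))
```

Change the format string by a single byte and every record after the first comes out scrambled, which is exactly the failure mode described above.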
1991: We must administer this IQ test to see if you possess the aptitude for this career. 2024: Lost your job hanging drywall to Jose? Learn to code.
People learned that IQ can be trained to some extent, or rather that a low IQ is often a lack of such training and education. Sometimes the questions asked require a certain experience that people may or may not have had, too. We feel like we can get a simple answer on how intelligent someone is, but in reality it's more complex than that. The most commonly known factor used to be the kid who had afternoon classes, learned an instrument and joined a sports club vs. the kid that sat in front of the TV all day. While it doesn't look exactly the same anymore, this factor of experience and training your brain remains unchanged.
@@actually_it_is_rocket_science I'm sure several people had mothers that worked on these types of systems. My mother learned programming mainframes at a bank when I was 11 or 12. I had a hard time getting my parents to buy a computer for home because she had gotten tired of dealing with them at work. I didn't get to use a computer at home until my dad brought home a Toshiba laptop from work which had the new 3.5 inch floppy disk drives! ;-)
@@TheKennyWorld One line in the article was weird where he said that somebody like his mom must be making a lot of money. It's like he doesn't know how much his mom makes, which made me question the authenticity of that article (before I read this comment)
@@SandraWantsCoke Could just be regular old Swedish taboo around asking about people's salaries. Although at least I'm pretty sure if I asked my mom it wouldn't be that much of a problem. That said, I do not know my mom's salary.
My first non-intern job was COBOL programming because I had COBOL classes at the university. That was 2018. I graduated in 2018. I still like COBOL; I felt like a true hackerman working with ISPF.
My company just helped a bank of similar scope to Nordea shift their entire banking core out of mainframe land (SDC). It can certainly be done. It was a gargantuan task, but nobody noticed they did it.
It's a dated language but the basics (variables / units of code / control structures) are there. The real issue is the domain specific knowledge, both in technical and in business / legal frameworks that codify it.
That's a tough niche to take. Even if you learn it, it's still 0 experience in the field, nowhere to get it, and even if a business needs it, demand isn't great, so you can't switch jobs after that. You become trapped with a skill that looks cool to another programmer but horrible to HR.
COBOL is where you write a verbose English dissertation about how you want your program to work and hope that it actually ends up working the way described.
One thing I find amazing about IBM and COBOL is the levels of backwards-compatibility, to the extent of using virtual tapes and disks to be compatible with physical ones. Pretty cool imo.
Well, not that cool, when you have to allocate a file size in terms of blocks, tracks and cylinders instead of bytes (cause they don't grow) and you have no idea how many bytes a block is or the geometry of the disk...
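For anyone who hasn't seen it, this is what that allocation looks like in a JCL DD statement; the dataset name and numbers here are invented for illustration:

```jcl
//* Space is requested in cylinders (50 primary, 10 secondary extents),
//* not bytes -- and the dataset will not grow past its extents.
//NEWFILE  DD  DSN=PROD.DAILY.TRANS,DISP=(NEW,CATLG),
//             UNIT=SYSDA,
//             SPACE=(CYL,(50,10)),
//             DCB=(RECFM=FB,LRECL=80,BLKSIZE=27920)
```

How many bytes you actually get depends on the device geometry (e.g. how many bytes fit on a 3390 track), which is exactly the complaint above.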
Yes - yuck. I started out in engineering on a mainframe that was TOS - a CDC 750/760 with a Tape Operating System. It had disk drives by then, but you would still run your program, then have to 'rewind' the disk file to run it again. I can still probably type rewind,* faster than most other programmers.
It's been about 30 years since I touched an AS/400, but the terminal text editor assumed program statements might be written to punch cards. There were reserved columns for line numbers, limiting the actual editor width to about 65-70 characters. All programs had to be formatted to fit this layout.
One of my coworkers' fathers is a COBOL programmer. Literally the youngest one at 65. They can't find anyone to replace them now that they are all retiring.
@@lexmercatoria2774 I'm in the US, and at 54 I'm one of the youngest I know that even knows COBOL, and I actually took classes in college that aren't taught anymore in the US.
I began coding COBOL (and assembler) in 1979 in a bank (sort of). It was an IBM mainframe; the current name is IBM System z. I recently began working again at my old workplace (as a consultant). They have tried to kill both COBOL and mainframes since the early 1980s. It has never succeeded.
8:30 The age is immensely valuable there just because of how many decades they have before retirement, in a field where almost everyone is 50-60+ as discussed earlier. As a COBOL manager, it'd be pretty appealing to have a 20-year-old employee with the same domain knowledge, and it'd be smart to incentivize them to spend their whole career at your org.
Feel like Prime missed the point there and took it as saying that being 20 makes someone a better programmer, which I don't think was the point at all. Chat got it though, I guess.
If institutions want young COBOL programmers, they need to pay competitive rates. COBOL programmers are at the low end of the salary range in the US. Below many other languages that have been around for a while, including some surprising ones (e.g. Perl). The languages that tend to be mostly senior programmers tend to have pretty high average salaries, but not COBOL. Every few years there's a bunch of articles claiming COBOL programmers are an impossible-to-find, dying breed and super in-demand, and every time I see those articles, I check job listings... and every time I see that it's BS. It's always low wages, whenever employers say they can't hire people to do a job. I think it'd be fun to wrangle big iron, but not if I have to take a big pay cut AND deal with stodgy banking/finance/government culture and workplace.
@@joecooper1703 The problem is COBOL programmers don't really bring in new cash flow or new add-ons the sales team can offer to clients. They're just there to keep the lights on and the banks running within regulation at this point. So they get treated like IT security, in that the execs won't see the value in the investment until it bites them in the ass.
I'm one of those from the "Deep Past" that worked on this type of stuff starting in the 80's. We referred to the database as IMS DL/I. Used it at a pneumatic tool company for part explosion of a tool. As for GSAM, it's a Generalized Sequential Access Method that would have been pronounced "gee-sam" and not "gasm". I don't recall working with that specifically, but I do recall working with ISAM and VSAM files. My first job was with a medical billing company. Through attrition I became responsible for thousands of lines of code and at one point was on call 24/7. That got old real fast. As for ISPF, Prime, I was probably as good at that as you are at Vim! Our monitors had one color (green) and the keyboard was a clacky, metal IBM one that you could kill somebody with. Oh, and we sat in cubes all day in suits. Today's developers piss and moan about their work conditions - well, LOL to them!
Uhm except that working in a cubicle wearing a suit sounds like the dream. These modern open plan startup spaces are all about cramming as many people as possible into a small space and making it as easy as possible for management etc to come and disturb you while you're trying to work so that they can feel more important. I work remote these days but back before covid I would have done anything for my own cubicle.
Like yourself I wrote COBOL all throughout the 80's. Myself along with 2 other equally skilled IT members coded an entire online/interactive MRP system for a machinery manufacturing company. This ran until that monster "Year 2000" scam. We had the system that we coded in the 80's already 2000-compliant. But management was convinced they needed some packaged software to survive the 2000 dates. It was the saddest time of my life when we were assigned to trash our code for that canned software. Our code would still be running today if management had had the balls to believe us ☹️
GSAM files are used when writing sequential files from some type of online applications. They are used because their updates can be automatically backed out by the system unlike traditional files.
You just drew some money out of a cashpoint? Probably an IBM z16 running COBOL. These things are everywhere and have 99.999999% (yes, 8 nines!) reliability and are absolutely state of the art (makes x86 stuff look like a Commodore PET!). IBM z16s can run programs from the old System/360 from the 1960s and System/370 from the 1970s unchanged. COBOL is amazing at data in/out as it memory-maps the files in and out with ease. It's all built in. It's insanely fast!
I can't tell you how many times I had to get on a System/390 and run HX to kill off COBOL programs stuck in infinite loops, dragging down the mainframe. I was just hoping that mainframe was separate from the Uni's production workloads.
@@MorningNapalm The IBM z16 is a server. The cashpoint clients can run on many different kinds of hardware and software, as long as they implement the correct API for the financial service providing access to the banks. In most cases businesses buy the equivalent of a proxy service which will interact with all the different bank-specific systems behind the scenes.
Had a story from my uncle who works at a bank. A new IT employee shut down the mainframe (don't know exactly how). They had to call people out of retirement to fix the aforementioned person's mistake.
I worked on mainframe systems starting in the early 80's. I once did something that seemed to crash our mainframe. I talked to tech support and they said that was impossible. I showed them what I did and it crashed again. They just told me not to do that anymore...
Btw. one of the favourite stories my father used to tell (he was an IBM SE, so he basically sold mainframes and the system software to run on them) is when he was asked to benchmark the then-newest IBM mainframe against the offering from Amdahl. What he needed was a program to chew CPU, to demonstrate task switch speed. So he wrote one, in assembler. The simplest assembler program anybody could write: just a jump (sorry, they call it "branch") to the instruction itself. What he didn't realize is that while IBM hardware checked for interrupts before executing the branch, the Amdahl hardware didn't check until it was no longer executing a branch instruction. But with the next instruction being, yes, a branch, that check for interrupts never occurred. So he basically hung the machine with this simple program. The solution was, of course, to just add a no-op before the branch, and jump to that. Oh, and btw. ISPF is cool (albeit scary if you have the mainframe at your fingertips). I liked it so much, I used the PC version (called SPFPC) for part of my first programming job (Turbo Pascal + a dBase compiler called Clipper, and the code for the latter is what I used it for). Btw. ISPF stands for "Interactive System Productivity Facility". IBM had quite a knack for calling stuff very basic things: their main programming language was PL/I (literally "programming language one" :P), disks were called DASD ("direct access storage device"), etc.
Although for a short time at the beginning of my career (1999), I used ISPF and shortcuts like i.3.4 or i.2 are still in my muscle memory. The text editor was great too: only 80x25 characters but with syntax highlighting and neat copy/paste system as well
My first job out of college was to black-box migrate applications from the IBM 4361 (VM/370) to the brand new IBM PC (developed using dBase IV and compiled with Clipper). One salary app used for proposing annual pay step/grade increases gave me fits. I could not exactly duplicate the mainframe output calculations from the same test dataset. I finally quit trying code changes and prototyped the matrix in Lotus 1-2-3 (a spreadsheet) so I could see changes to the calculated values instantly. It took about an hour but I found it. The original programmer rounded the calculated row values to two decimal positions but let the machine default to sixteen decimal positions for column calculations. When this discrepancy was presented to management, the decision was made to carry the error forward into the new application for backward compatibility. 😂🤣😅
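That kind of discrepancy is easy to reproduce in any language. A minimal Python sketch with invented numbers (the original was COBOL on VM/370):

```python
# Round-then-sum vs. sum-then-round: the two totals disagree by a cent,
# the same class of discrepancy as rounding rows to 2 decimals while
# letting columns accumulate at full machine precision.
values = [1.111, 2.222, 3.333, 4.444]

# "Row style": round each value to 2 decimal places first, then sum.
row_style = sum(round(v, 2) for v in values)

# "Column style": sum at full precision, round only the final total.
col_style = round(sum(values), 2)

print(round(row_style, 2), col_style)  # → 11.1 11.11
```

Once a downstream system has been reconciled against one of the two answers for years, "carry the error forward" can genuinely be the cheaper choice.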
Started writing COBOL in 1978 on punch cards to run on an IBM mainframe. Retired 2013 after extensive career path IMS systems programmer, DB2 systems programmer, Unix administrator, Tech analyst, Performance analyst writing scripts/programs in COBOL, C, C++, Java, Visual Basic, C#, VB#, Perl, Excel VBA, SAS, against Oracle and Sybase databases and Unix/VMWare servers. Retired 10 years, coding mainly in VSCode Python with some R and Mathematica. With help from Microsoft Copilot, Meta Ai and Perplexity AI. against large public astronomical online databases. Never gets old. 👨💻👨💻👨💻
Hahahaha, love this. Did a year of vocational training in COBOL 10 years ago and work as a COBOL developer today, as a 34-year-old in Sweden as well. The first system I worked in was 8 years older than me.
They were only badass when you had to fix their code. (Dirty little secret: a large percentage of legacy COBOL code was actually assembler 'translated' to COBOL so that the assembler programmers could keep their jobs. Most of them hated 'high level languages' and thought assembler was superior because assembler "gave them complete control over the computer".)
My first job out of college a couple years ago was at a consultancy. The first project I was placed on was for an old manufacturing company that needed their ERP system modernized, since the old one was written in COBOL and running on some ancient IBM mainframe. Also, the guy who had written 90% of it had long since retired, and nobody fully understood all the logic that was integral to the functioning of the company. I had to spend quite a bit of time combing through old COBOL code trying to understand what it did so I could reimplement the same logic in the new system. Never again.
@@mattymattffs ya that's what we tried telling the customer, but they didn't understand how it worked and wanted the new one to work the exact same as the old one. They even wanted me to replicate arithmetic errors caused by truncating decimals in intermediate steps of a calculation. And you know what they say, the customer is always right.
So I'm a young-ish hire at IBM and I'm learning all of these technologies. IMS, DB2, ISPF, this is all just another staff meeting for me. It's pretty cool to hear an "outsider" hearing about it for the first time and offering their thoughts. I don't do COBOL, but I do some programming in REXX, which you could sort of compare to Bash for the Mainframe.
In the 1980s, I worked in banking, but tried to avoid COBOL. I was programming in TAL (Transaction Application Language) on parallel-processor, fault-tolerant Tandem NonStop systems using relational databases. They were acting as the front end for ATM and wire transfer networks for IBM mainframes. There was an interface on the Tandem system that made it look like a tape drive to the mainframe. Every night, a batch job would run to update the account information (balances, etc.) on the Tandem ATM system, so I was tasked with development on both the Tandem, and on the mainframe in COBOL. Also, the Tandem used a client/server environment where the client programs for the terminals were written in SCOBOL (Screen COBOL), a version of COBOL. There was nothing quite as irritating as submitting a batch job on the mainframe to test the interface and waiting for the job to execute just to find out that you missed a period.
My dad is a now-retired COBOL programmer for a large insurance institution. These stories had a very familiar tone to the ones I heard growing up. I sent him the video and am seeing what he thinks of it.
My mom programmed in COBOL from the mid-1970s. I learned about computer code by reading the stacks of greenbar she would bring home with code listings on it, generated after they input thousands of punch cards. She retired in the mid-90s and took on a much less stressful job at a local museum instead. She, however, stoked an interest in computers and programming in me. I was a software dev for well over twenty-five years and now I work in the appsec world. Of all the stress I have had in my career, none of it would match the stuff she had to go through in hers: 24/7 on call, along with having to make everything right the first time, because it would take hours to fix even a small mistake and resubmit a job.
Started coding in COBOL in 1980 and still am. Sometimes knowing legacy systems is the pathway to job security. :-) I hope to retire next year. I'm tired.
I got my first job at IBM as a mainframe batch processing operator and I have to say it was an experience like no other. We were working for a large bank in Poland and I do remember some of those things the article is about, like batch processing, transactions etc. There were situations where batch processing (at night) was prolonged, e.g. due to the size of the processed data or some error, and as a result, certain jobs that acted like a "cut off" of the day (which normally should have been run at night when most stores were closed) were run in the morning when the stores were already operating, which led to a temporary (approx. 15-20 minutes) blocking of their payment terminals. Fortunately, I haven't worked this way for over 20 years, but I'm so grateful that I had the opportunity to work with such people and see what it's like. And the responsibility is huge when it comes to banking and money-related operations.
I remember our operators leaving the newbie in charge while they piled off to the pub for a birthday celebration (oh, the good old days); all he had to do was feed the printer and call the pub (across the street) if a problem came up. They returned to find there was a logical recursion loop in the job execution sequence: four hours of duplicated reports :D
It would be fun if you'd actually write and show some modern COBOL. I hear the modern stuff (that runs on VMs, I guess) isn't even bad. It's just a DSL for business. The first-ever, even!
This is a great article. Loved your read of it. My first paid programming job was in 1991 (Turbo Pascal 5.5, pre-Windows). This was for a very large 'Britain-based Telecommunication' company. We had to call statistical library code in MS-DOS Fortran. The biggest change in the last 30+ years, to me, is the editing and compiling tools. The conveniences that Sublime Text or VS Code (even VS) bring would have been incredible to 21-year-old me - far more than any graphical, speed or language change. Languages change, algorithms wax and wane, libraries and APIs come and go. But back in the day we often had no easy way to read one file while editing another, and build times could be measured in tens of minutes (it was easy to forget what you were tracking down and trying to deal with, when you had an enforced 10-minute break before you could check the outcome of your tweak). Your (physical) notebook was your friend. The official documentation was not a PDF or a website but instead in dead-tree format (often excellently written).
I'm a programmer in a bank. Something you struggle with when you start (I did as well) is how much process and how many gates there are to getting things done. I can have a fix ready in a couple of hours, but it can take 3-5 days to be deployed going through deployment processes. This isn't because we like process, but because we have regulatory obligations, segregation-of-duties constraints, risk and security controls, and more importantly we are dealing with money and sensitive PII; a transaction screwing up and somebody not getting paid has big real-world impacts on people. Failures of the platform I work on can cost us a lot, as it's used for regulatory reporting, anti-money-laundering and fraud monitoring. We still have mainframe systems and handle EBCDIC-encoded data produced by them.
I've worked in a shop with both mainframe and PC development, and the research studies I've seen at many other larger companies show that many migrations cost millions of dollars and take years to complete. I've even read about several that cost millions and took years, and at the end of the project, it was too incomplete/buggy to proceed, so they scrapped it. It is indeed NOT a trivial task. I'm not a mainframe advocate, but that platform is designed for high throughput and heavy I/O and is very stable. Of course it comes with its own set of drawbacks and limitations, but that is true of everything in life. Use the best tool for the job.
A company I worked for in the 2000's had a large codebase written in Honeywell assembler in the 1970's. After a few tries at migrating the code to something modern without any success (no one knew everything about everything in the system), and with the Honeywell hardware it ran on no longer available, it was decided to write an emulator in C to run the code on IBM hardware running AIX. Because I had been a tech for Honeywell, I was tasked with explaining all the weirdness of the H2000 instruction set to the youngish engineers who were writing the emulator, then figuring out all the tricks the original programmers had done that didn't exactly conform to the way the instruction set was described in the manuals. When everything was finally running it outdid the original hardware by a large factor (the original Honeywell hardware ran at 4 MHz). As far as I know that system is still in production.
When I started as a mainframe programmer back in the early 80's one of the first jobs I had was to work on a team that was replacing an Assembler system written in the early 60's. In order for that system to run nightly they had to IPL (initial program load - reboot) the IBM mainframe and run an emulator just for that system and then re-IPL when it was done. Operations was so happy when we finally retired that system.
You should talk to Veronica Explains (@VeronicaExplains). She is a COBOL dev (and also sys admin?) and has a great channel. Getting her perspective would be super interesting.
One of my university courses was building a payroll system using COBOL and PowerHouse. It was over 300 pages when I compiled and printed it. I was the only person in the class to get 0 errors.
Yeah. I'm in the position of maintaining about a million lines of code written in an ancient compiler. I actually wrote most of it over the past twenty-five years. It just happens to be a mission-critical program for oil refineries around the globe.
When you understand a language, you also understand where it shines the most. For banking, COBOL was perfect because it handles transactions very well. However, the last time I worked with it, I learned that back then the transactions were done at night, on a couple of mainframes.
Can agree with the statement on rewrites. I'm working for a SaaS that has existed for about 10 years now, originally written in PHP and React, and in the process of being rewritten for the last 2+ years at this point... We're still not at the agreed "feature parity" (which would skip implementing some features in the new version), despite having more than twice the number of devs the old system saw at its peak (4 vs 9). Even though I've been here for 9 years, I didn't foresee how big a waste of time it would be to rewrite everything (including the backend) from scratch... Don't do it folks, just don't do it 😂
42 years in IT, almost all in COBOL development. Worked for a bank, at a large corporate office, and for the last 28 years for a company with a credit card processing application that ran on an IBM mainframe. One of my primary tasks was developing API transactions that allowed mid-range applications to use mainframe transactions. We used MQ Series, an IBM product that allows any remote application to communicate with the IBM mainframe. It was highly successful. We used it to develop web-based customer service applications and customer-facing web sites. This allowed internal users and customers to use web sites to access what they needed without interfacing with mainframe screens. Basically, we put a pretty face on the mainframe, but all the real core processing still happened on the mainframe. Our mainframe system had approximately 6,000,000 lines of COBOL, and completely rewriting it would be extremely high cost and high risk.
Around 1990 the mantra was 'the mainframe's dead' and 'paper's obsolete'. IBM and wood-pulp stock tanked. I failed to secure my 'furniture' loan and so missed that boat completely :D
Worked at a life insurance and pension company on a mainframe code migration in 2012. Looked into the COBOL code and found a comment in a module from Feb. 2008: "This code has been checked for the year 2000 bug." That code was designed to still be running after its developers had probably died 😮
I've only poked around COBOL out of curiosity, but... I kinda unironically loved it. I do need to do a bunch more (and I have a couple of project ideas for getting deeper into it). I don't yet know it nearly well enough to judge how much I'd like it in an actual project, let alone a giant legacy one doing essential jobs, but it does interest me.
IMS is a hierarchical database built on top of the mainframe file system. The mainframe file system has more in common with an ISAM database layer than with a modern disk file system like NTFS or ext. Mainframe files are of fixed size, with segment growth; they are structured, and there is transactional support for everything, etc. There even exists "The ISAM" (a product): it **was** developed on the mainframe, and then its concepts were copy-pasted into all other databases. DB2 is a classic relational database, and on the mainframe it is effectively an SQL interpreter plus indexing running on top of the file system (remember, it's structured and transactional). IMS and DB2 on the mainframe are like having two query/organizational engines on top of the same data layer.
Anyway, it seems they are doing some batch bank clearing job (a scheduled file upload) which is not very well organized. The chosen data structure is not particularly suitable for the task at hand, so their way of querying is to run linearly through the data, maybe even with multiple passes. Hierarchical databases like IMS or LDAP are fast if you are querying along the hierarchy. If you are going against the hierarchy, well, it is like taking SQL, doing a Cartesian product of multiple tables, and then querying that. 10 TB of data is not a lot for even the smallest mainframe. For reference, even the smallest z16 mainframe, like one rack of computer, comes with terabytes of RAM to begin with and has CPU cache sizes well into gigabytes. I guess somebody designed something small 50 years ago, and now the business around that program has grown to monumental scale. Now there are multiple programs doing god knows what around an original data structure designed one Tuesday afternoon for a minor program that became wildly useful for a particular period of time. That program has been offline since the early '80s, but the data structure persists. They could migrate, strangle the system, but as long as the business side sees no value in migration, everything is going to stay the same.
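To make the "with vs. against the hierarchy" point concrete, here's a toy sketch. Plain Python dicts stand in for hierarchical segments; all names are hypothetical, and real IMS access goes through DL/I calls, nothing like this.

```python
# Toy hierarchical store: branch -> account -> record, roughly IMS-shaped.
db = {
    "branch-01": {"acct-100": {"owner": "alice", "balance": 250}},
    "branch-02": {"acct-200": {"owner": "bob", "balance": 90}},
}

# Querying WITH the hierarchy: follow the key path directly -- O(depth).
def get_account(branch, acct):
    return db[branch][acct]

# Querying AGAINST the hierarchy ("which branch holds bob's account?")
# forces a linear scan of every segment -- the Cartesian-product effect
# the comment above describes.
def find_owner(owner):
    for branch, accounts in db.items():
        for acct, rec in accounts.items():
            if rec["owner"] == owner:
                return branch, acct

assert get_account("branch-01", "acct-100")["balance"] == 250
assert find_owner("bob") == ("branch-02", "acct-200")
```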
I am a mom who was a COBOL programmer for 20 years; I just recently shifted to a more mainframe-sysadmin role. It's blazingly fast for transaction processing; nothing can match it. If a transaction slows to more than 60 milliseconds (via a CICS plex; CICS programs are written in COBOL), we look at where the issue might be. It slows down dramatically when it calls a non-mainframe system for something, like currency conversion.
What an absolute nightmare. I've written a CNC control system entirely in Motorola 68000 Assembler, and 30 years on, the machines are still running without any software updates, hard drives or anything from the modern world. Every bit of that code is exactly as it was written in the final iteration and never changes. That's a huge problem if it has bugs, but when it works, at least you know that it will forever be the same. Embedded systems used to be a nightmare because we couldn't easily fix things in the field. Nowadays, you can sometimes update them over the air. However, it's still scary to know that millions of units are going to be out in the field, and you can't fix them if you've made a terrible error.
I left IT a few years after Y2K. Our divisional head hated our section. He thought mainframes and UNIX were a waste of money as Windows was the future. The first he got rid of was our manager. We worried about him because he was a COBOL programmer who spent more time looking after the payroll system than managing us tech support peeps. Yet he was never out of work once he left. He worked less; only about 8 months out of 12. He ended up earning twice as much money too. The other chap (from another section) they made redundant was a PC chap. He ended up working in a warehouse stacking shelves.
@@javajav3004 Our divisional head was a lunatic. It was as if he thought we were responsible for the company owning the mainframe. It was a COBOL application that prevented the mainframe from being switched off. They eventually cobbled together a 'system' to replace it. But it was a disaster. Luckily the company joined a much larger entity about 2 years later, which saved its bacon. The obvious solution to us in tech support was for the company to have bought a nice AIX server and just moved the application across onto it.
My dad worked with a man in the 1990s whose mother came out of retirement because she was a COBOL Queen.... and that was 30 years ago... the world still runs on it.
A “module” is a set of code (with usually one entry point and one exit (return) point) designed to perform a limited task. It would be “link-edited” with other “modules” prior to execution. There is usually a “control module” (which controls the logical flow of the program) which “calls” many sub-modules by passing parameters to them; each sub-module “returns” control to the control module, along with the result(s). For example, many modules would be link-edited together (to form a program) and execution would be passed to the control module to calculate payroll. The payroll “control module” would pass a set of parameters to a sub-module responsible for calculating all state income taxes. The sub-module would calculate the tax for whatever state was requested using the parameters passed, and return the calculated amount to the control module. The control module would then call subsequent sub-modules until all functions were accomplished. I am a Y2K “survivor.”
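A rough sketch of that control-module/sub-module flow, in Python rather than link-edited COBOL. The states, rates, and field names are all invented for illustration; the real thing passes a parameter area between compiled modules.

```python
# Hypothetical sub-module: given parameters, compute one limited result.
def state_tax_submodule(state, gross):
    rates = {"NY": 0.0685, "CA": 0.093, "TX": 0.0}  # made-up flat rates
    return round(gross * rates[state], 2)

# Hypothetical control module: owns the logical flow, delegates the work,
# and collects each sub-module's returned result.
def payroll_control_module(employee):
    tax = state_tax_submodule(employee["state"], employee["gross"])
    return {"gross": employee["gross"],
            "state_tax": tax,
            "net": round(employee["gross"] - tax, 2)}

result = payroll_control_module({"state": "NY", "gross": 1000.00})
assert result["state_tax"] == 68.50
```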
Mainframe Programs are written in a specific language (COBOL, ASM, Fortran, etc...). They are then individually turned into Object Modules (machine code) with specific characteristics through a Compile or Assembly process. Object Modules then go through a Link Editor or Binder process to create a Load Module with a specific Entry Point and characteristics. Load Modules are what mainframes actually execute (EXEC PGM=Specific Load Module) even if they contain only one actual user created Program (as Load Modules always contain multiple system supplied Object Modules as well). If you have multiple Object Modules working together then the Link Edit or Binder process would combine them into your created Load Module assuming they are Statically referenced. Any Programs that you want to Dynamically reference have to be turned into their own Load Modules and placed in an appropriate Load Library and normally will NOT be included in the original executed Load Module (as they will be loaded by the system when actually referenced). So Programs are written in a specific language and turned into Object Modules. Load Modules (always containing multiple Object Modules) are executed. Load Modules can contain hundreds of Object Modules combined together (from multiple different programming languages) and can reference other Load Modules (containing multiple Object Modules) Dynamically during the same execution.
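A loose analogy for the static vs. dynamic distinction, sketched in Python. This is only an analogy: Load Modules are resolved by the z/OS binder and loader, not by anything resembling `importlib`.

```python
import importlib

import json   # "static" reference: bound up front, like a link-edited module

def dynamic_call(module_name, func_name, *args):
    # "Dynamic" reference: the module is located by name at the moment it is
    # needed, like the system loading a separate Load Module from a library.
    mod = importlib.import_module(module_name)
    return getattr(mod, func_name)(*args)

assert json.dumps([1]) == "[1]"
assert dynamic_call("math", "gcd", 12, 18) == 6
```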
I remember working with a small company. The government updated the API for handling tax information and transport manifests, and the company lost a lot of money on service downtime while the devs were scrambling to get everything up to date. The last thing that crosses your mind at a moment like that is a rewrite.
Sometime around 1991, I created a scrollable listbox/combobox in Cobol on a VAX system that used VT320 (they might have been 220) terminals. I created a fully functional text editor (modeled after some Unix editor whose name I forget). These were monochrome monitors with no graphics cards of any type. There is nothing wrong with the Cobol language. The problem is with programmers who lack the necessary vision and imagination to create solutions to problems.
We are only a 100th (by GDP) the size of the US. But the major banks where I live have all moved much closer to real time: if I transfer money to someone else's account at a different bank, it takes at most 15 minutes, usually significantly less. This only became a thing here in the last 5 years or so. But the idea that banks can't change feels a little dumb; it feels more likely that they don't want to change.
I worked at banks in Mexico; none of them used COBOL. Other Mexican banks do use COBOL, but they were all replacing it with newer COBOLs, Java or C#. Part of the reason is that mainframes are way too expensive and only American or EU banks can afford them. So even the largest Mexican banks with COBOL mainframes are using second-hand small mainframes, and they keep using them because they are still freaking fast: a college professor told me like 10 years ago that their mainframe was still faster than their cluster.
"Just rewrite it" - it'd be nice if we could have some sort of legal fine for people who misuse the word "just". "Just fly to the moon", "just secure it", "just get it done", "just figure it out". Rewriting something large and established is almost always a disaster. Maybe in the long run, done well, it could be a good idea, but you'd take your problem of lacking people with legacy development experience to maintain an old system and turn it into the problem of lacking dramatically more of those same people, who must also be competent in whatever new language you want to use, so that you can maintain the old system while writing a new one. The best approach is likely something that can interoperate with the old system, so new stuff can be written in something more modern and old modules used or updated as required, but I'm not sure that tooling for such a thing exists.
Well, I tend to agree, but let's play devil's advocate for a minute. If you swap out pieces, you are forever stuck with the original workflows and business processes, when starting over might make a lot more sense. An overly basic example: we replaced printers with terminals, and 50 years later we are still stuck with CR/LF interoperability issues that would never have happened if you'd started completely fresh with a variable-sized semi-graphical terminal. But we replaced printers with digital versions and only later added new stuff. How many decades did it take to replace the BIOS? In business this happens too: the old program's capabilities and limitations created the business process, and now the business process (along with 1:1 interoperability with code on the old system) dictates how new code is developed. A fresh start is incredibly disruptive and often not the best choice, but replacing individual pieces without re-thinking the overall design is also very costly in the long run.
@@thedave1771 For a bank, many business processes are set out in law, and their complexity is why the COBOL code is so valuable. For other orgs, changing business processes is a major undertaking, and the software is very unlikely to be the limiting factor in an organization large enough to have its own custom software. A fresh start is a decision the business could make, but it is very risky. Time and again we've seen that total rewrites of large software systems are total disasters. Remember KDE 3 -> KDE 4? Hell, Python 2 -> 3 was insane, and it really didn't change all that much.
I have been a COBOL programmer since 1987! My first was a Sperry Univac S-80, my second an IBM AS/400, and for PC (Zilog Z-80, and DOS 3.2) it was RM/COBOL. Now I am 60 years old! A living legend.
"There's nothing wrong with Cobol" "Our bank was down 16 hours straight because someone forgot to add a dot, and we have no local dev environment to catch basic mistakes like these"
To be fair, that seems like an issue with how they have their development cycle set up, not with COBOL the language itself. Missing a semicolon or parenthesis in a modern language will do the same thing if you don't have fancy IDEs with squiggly red lines to warn you lol.
That's a problem with development practice, she even says later on that she isn't that worried about pushing code since they have a robust test environment. I wouldn't be surprised if the 16 hour downtime was the impetus for creating the test environment. It's not the language, it's the lack of testing that caused the downtime.
I was also born in '64. I had a course in COBOL programming back in '84. Everything about that course was unpleasant: the IBM operating system, the line editor for editing our files, and the disk crash that lost our final assignments.
I felt a bit disillusioned at the end of my master's (not CS) and applied for a COBOL job since no Real Programmers want to go near it and the money is sick. The interviewer said they're looking for someone passionate about insurance, who can really dive deep into the field and stay for the long haul. That's when I realized what universe of boredom I was gazing into.
Yeah, I think that's the most unappealing part of COBOL. The problem is not the language itself. It might suck, it might be weird -- JS, PHP, and Rust are idiosyncratic, "weird" languages themselves. The problem is that you _know_ you're only valuable as an expert on a technology that is not yet dead -- even if everyone involved wants it to be. Your employer's incentive is to have you stay with them for as long as possible, hopefully until retirement, tackling the same types of problems from the same perspective in the same business domain. However, the longer you stay with that technology stack, the more you're missing out on newer stuff, and burying yourself deeper into the COBOL "stack" -- and _that specific business_' business model. If you're a run-of-the-mill "full stack" dev -- even if you're mostly focused on one specific thing, like front-end -- you can hop between wearing front-end, back-end, and even infrastructure hats. Think you've been way too long on the same project/team? You can easily switch that for another team -- or another job. Work with COBOL at a bank? Sure it pays well -- great even -- but that's _the_ thing you'll be doing for a loooooooooooooong time.
This is incredible. I had a short stint contracting with government entities. A lot of their systems were IBM mainframe COBOL systems that we interfaced with via integrations. I didn't do a ton with it, but it was very interesting stuff and a major source of pain for some of my coworkers.
@@scottstempmail9045 I still remember laying out screens on graph paper and using a tool called mapcomp to generate the BMS macros for each transaction. The stuff I learned doing CICS served me pretty well when I started doing web programming with CGI since it's basically exactly the same programming model.
I learnt some COBOL in 1995 before learning C; not sure how close this was to mainframe COBOL, but I enjoyed it. The Y2K "bug" had nothing on the problems we were storing up for ourselves with COBOL underpinning the financial system. I am now a thor goblin using GameMaker for fun.
I know nothing about this subject but the word Cobol in the title caught my attention. I'm 67 and when I started college back in 1974 I found myself living in a dorm filled with engineering students. They were all absolutely obsessed with learning Cobol. There seemed to be endless Cobol classes and the students clearly considered learning it to be essential for their career success.
Was taught mainframes and cobol at my first job after getting my cs degree in 2002. The biggest thing I think this article missed was how incredibly strict their code standards are. Only a small portion of the training was learning cobol itself, most was learning this company’s specific standards such that given a task, two programmers should produce almost exactly the same code. Every part of the development process was painfully slow and cumbersome. Just running a program involves scheduling a job that ‘prints’ to a virtual printer (the terminal), and the default was still to print to dotmatrix paper that the mailroom would dutifully deliver to you. The db was still classified as ‘cardless files’, and everything we wrote could technically have been encoded on punch cards (which were not hard to find still floating around in the drawers of old timers). It was fascinating to learn all the old optimization tricks like using packed decimals and hex encoding to save just a few bits. It was the only time in my career that I’ve worked that low level. I value the experience, but definitely don’t regret giving up the job stability that may have come from staying on that path.
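The packed-decimal (COMP-3) trick mentioned above is easy to demonstrate: two decimal digits per byte, with a sign nibble (C for positive, D for negative) in the last byte. Here's a toy encoder of my own, not IBM's exact implementation:

```python
def pack_decimal(n: int) -> bytes:
    """Encode an integer as IBM-style packed decimal (COMP-3):
    two digits per byte, sign nibble in the low half of the last byte."""
    sign = 0xC if n >= 0 else 0xD
    nibbles = [int(d) for d in str(abs(n))] + [sign]
    if len(nibbles) % 2:                  # pad to a whole number of bytes
        nibbles = [0] + nibbles
    return bytes((nibbles[i] << 4) | nibbles[i + 1]
                 for i in range(0, len(nibbles), 2))

def unpack_decimal(b: bytes) -> int:
    nibbles = [x for byte in b for x in (byte >> 4, byte & 0xF)]
    value = int(''.join(str(d) for d in nibbles[:-1]))
    return -value if nibbles[-1] == 0xD else value

# 12345 fits in 3 bytes (0x12 0x34 0x5C) instead of 5 character bytes.
assert pack_decimal(12345) == bytes([0x12, 0x34, 0x5C])
assert unpack_decimal(pack_decimal(-987)) == -987
```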
We could adopt a perspective where COBOL and Perl aren't "ancient" languages but simple tools for simple tasks. You wouldn't call a light switch outdated, just simple. You wouldn't see implementing a light switch with a microcontroller as inherently more sensible than replacing a broken switch with a new switch. We still use knives, screwdrivers, ovens and fabrics without losing our minds over how hard it is to train new people to work with these ancient, outdated technologies. Yet software, even when it has very real implications for hardware, is somehow different. In software, adding complexity without any need to do so is "future-proofing". And how maintainable something is, is often measured by how well it conforms to a general design that could potentially be extended to any computable task, ever. You know, in case we in the future need to integrate the light switch into a group chat. If we don't update the system to use a microcontroller now, we are absolutely screwed when we need to connect it to the family chat. If we instead think of simple tools as ever relevant to simple tasks, it makes perfect sense to train people to work with COBOL and Perl, like we train people to work with knives and switches.
I currently work for a bank and like to joke about how old some of the systems are... Then I see this and like, huh turns out maybe we are actually up to date by banking standards
Former COBOL programmer here. After having watched the video, I feel like digging into my 26yo COBOL code again. Curious whether I can find my way around the code of my younger self.
ISPF is a great editor, although some similar mainframe vendor editing products have additional features not in ISPF. Any vendor that makes a product similar to IBM's must be better or have additional features to survive. Many vendors make their living creating better products than the ones IBM includes for free.
First job, 1996: COBOL85 on Tandem mainframes at the Bank of England. Worked on Real Time Gross Settlement, Central Gilts Office 2, a little bit of CHAPS, and later on CREST. If our team had left or been struck down by illness, the entire UK financial system (RTGS, CHAPS), bond payments (CGO2) and stock market (CREST) would have ground to a halt. And most people were legless drunk at least 50% of afternoons.
Tandem is still the best hardware and operating system. Dude, Pathway rocks, still does; nothing yet comes close to its resilience and massive parallelism. Hardly any RAM though; every byte counted and every variable had to be justified.
Three weeks late to the party: the article was absolutely awesome and the mum is a Giga-Chad Power 10000! F this "Substack" platform for not letting me tell her how awesome she is! Great respect!
Probably the only way to "rewrite" the system is to implement a new banking system from the ground up with modern tech, treat it as a new bank, and slowly move accounts to the new system one by one. Though it would probably take 3-5 years before the new system would be usable.
I have a friend who is also working on COBOL code bases that are decades old and were never planned for refactoring. This situation of the talent pool going away while the need continues is also the basis of the movie Space Cowboys. There, the issue is that NASA has to recapture a satellite for which all of the engineering talent has retired (Tommy Lee Jones and Clint Eastwood).
5:40 - We literally had a conversation like this just today. When deploying a hotfix, I asked why the API portion of a particular repo under our care had both PHP and Node in it. One of the senior engineers literally said "a guy that used to work here just wanted to try something in Node, so now we're stuck with that for ".
A "module" is a program, made by compiling one or more source files to create the application, same as now. Each module (program) can handle transaction data, work with various databases, and create printouts. Or a module can be created to run under the CICS transaction system, which means people type in various commands at a terminal and press function keys. Modules can "talk" to each other by one writing to a file and a later one reading that file, similar to Unix pipes, but not simultaneous.
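The file-handoff style of module communication described here can be sketched in a few lines. The file name and record format are invented for illustration; the point is that the two steps run at different times, unlike a pipe.

```python
import os
import tempfile

# Hypothetical dataset handed from one batch step to the next.
path = os.path.join(tempfile.mkdtemp(), "TRANS.DAT")

def module_a(path):
    # First job step: write transaction records to the shared file.
    with open(path, "w") as f:
        f.write("ACCT100 +250\nACCT200 -090\n")

def module_b(path):
    # Later job step: read the same file and total the amounts.
    with open(path) as f:
        return sum(int(line.split()[1]) for line in f)

module_a(path)          # runs to completion first...
assert module_b(path) == 160   # ...then the next "module" picks up the file
```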
My last COBOL job experience was in 1999: changing two-character 99 years to four-character 1999. 99 was also a well-known check value in COBOL programs. My cost center (me) made several million doing conversion work prior to Jan 1, 2000.
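For readers who never saw a Y2K fix: besides expanding fields outright as described here, a common cheaper patch was "date windowing", interpreting two-digit years relative to a pivot. A hypothetical sketch (the pivot value is an assumption; each shop chose its own):

```python
PIVOT = 50   # assumption: 00-49 are read as 20xx, 50-99 as 19xx

def expand_year(yy: int) -> int:
    """Expand a two-digit year to four digits using a sliding window."""
    return 1900 + yy if yy >= PIVOT else 2000 + yy

assert expand_year(99) == 1999   # old records stay in the 1900s
assert expand_year(5) == 2005    # new records land in the 2000s
```

The catch, of course, is that windowing only defers the problem until dates cross the pivot again.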
It's pretty scary to think about the possibility for a lot of the systems we rely on and take for granted collapsing because of their complexity and the lack of people technically capable of maintaining them.
I programmed old mainframe COBOL for over 30 years. Nobody should try to convert these old systems. Businesses need to document what these old systems do, to create a requirements document for a clean new system. It's possible to run into code that depends on tape processing, JCL executions and other weird things that are not in the code, but that the code depends on to process correctly. I've been retired for years; no one could pay me to analyze old COBOL again.
Old mainframe programmer here, one of my jobs used to be to analyze code that vendors wrote for us, especially if we had issues with it during or after a project. I would analyze their code and then document what it actually did for the business side. They were often surprised because it did not do what the original specifications called for it to do. Sometimes I would modify it to specifications other times someone else in house would change it.
Every language is a tool. Some tools are arguably better than others, but many tools are just designed for different tasks, so it wouldn't be fair to compare a wrench to a screwdriver, or to complain that a screwdriver is inefficient at driving a nail.
I program in PL/1, also on the mainframe, which stands for "Programming Language 1", but it is not really. :D Concerning the age: I think he writes it's a goldmine because most young people react with "never ever", but the old programmers are retiring, so fresh blood was kinda needed. BTW, a module is probably somewhat comparable to a Java class, but probably a bit bigger, as the legacy stuff is written procedurally, not object-oriented. (I am 34 and the 2nd youngest in my team; Germany.) BTW, they do not literally write the data on a tape; the formats just persisted, so people still have the architecture and the "naming" (i.e.: track, block, cylinder, etc.). P.S.: To the young lad who claims you can rewrite this code: you first have to see the codebase before you speak. I would take any bet that you would not be able to. Also, I think there is no need, at least where I'm working. Old does not equal bad. I would claim it is quite well written even, especially when I compare it to a lot of modern stuff, although most people I know would disagree, because that's what a lot of people wanna do. If code is efficient, works properly, and is easy to read, it is good code; fight me on that if you feel the need :).
To be fair, mainframes are quite advanced today: DB2 is a good DB, and there are now other languages than COBOL. But the legacy problem is incredible. I'm working in a bank; I was on a project in which we used tools like text analysis and network analysis to try to understand how data moves among tables, because nobody knows and there was no documentation... Basically it's archeology...
One of my first jobs was working on decommissioning mainframe applications for the space program. While much of the code (Adabase/Natural) was written in the early 80's, a lot of the people involved were still working when I was there, so it was common to ask questions of the original authors when looking at reports or batch processes. One time, I asked my boss if we could set up a meeting with a specific person, and he asked me to look out the window and said "Do you see that big tree out there? That's their memorial tree."
That's so cool. A bit sad too. RIP to all the legendary programmers
Welp....
@@nisonatic Imagine being dead and still being required to take calls because it was in your contract...
I recall Adabas/Natural, though I never got to work with it. App coding plus its own database, from memory!
Nice u remembered
I wrote COBOL programs 45-50 years ago. The "dot" at the end of a line is a period. It was a well known error we called a "pregnant program" - one missing the period.
Lmfao 😂😂😂
It's called a full stop in Australia! We had a discussion in one of my programming classes on what programming syntaxes are called in the US and the rest of the world and there are quite a few differences.
"This is so legacy that the person that first wrote it died of old age." 😂🤣
So legacy that the original writer is dead, and no one knows if the FTP of records still goes out to an extant company
Have you ever heard the Y2K joke?
A COBOL programmer, tired of all the extra work and chaos caused by the impending Y2K bug, decides to have himself cryogenically frozen for a year so he can skip all of it.
He gets himself frozen, and eventually is woken up when several scientists open his cryo-pod.
"Did I sleep through Y2K? Is it the year 2000?", he asks.
The scientists nervously look at each other. Finally, one of them says "Actually, it's the year 9999. We hear you know COBOL."
These systems are so legacy that they were legacy when she started 25 years ago.
I had a moment where I opened a module and saw it was first created before my birth. That's a moment when you think damn this is legacy
Dude, this stuff is so legacy, the entire first TWO generations that maintained it are dead. The 3rd generation that picked it up has either retired, or IS retiring in the next couple of years. I had my shot at going into COBOL early on in my career, and NOPED out. It was old/creaky and nobody wanted to F with it back then, and that was 35 years ago... What was a little SQL and a code snippet in Foxpro v2 to generate a report, was PAGES of code in COBOL.
COBOL programmer positions should be passed from parents to children. They should have COBOL as an integrated part of their family name too, like a religious Order.
And the linked article is flagged as spam. WTH
The Lords of COBOL is a different story. :)
I totally agree
so basicaly like medieval Guilds
with secret ceremonies and shit
Grandmaster, masters, journeymen, and apprentices.
Lots of English names are that way: Smith, Baker, Taylor, Tanner. So would they go with Coboler, or is there something better?
@@NeilHaskins or just straight cobol
Johnny Cobol
hero we need but dont deserve
Mainframe cobol programmer / architect signing in. I could share stories you people would not believe. Mainframe is one of a kind. I've seen systems up and running (uptime) for 14+ years. I've seen code running in production developed in 1982.. Mainframe is the powerhouse of the world without you knowing about it. AMEX, VISA, Mastercard. Anytime you swipe your card to pay, it goes thru a mainframe.
Bro, I just modified a program that is still running till this day since 1974, for an insurance company.
My respects partner !
Mainframe support guy here. COBOL is robust and dependable and self-documenting. Why are legacy programs and systems around for decades? Because they did it right the first time.
@@thomaslink2685 "...they did it right the first time"? Let's not get carried away - I'm pretty sure programming back then was just as iterative as it is today.
FWIW - I work at a bank and sit next to one of the guys that did their mainframe assembly application. He's told me a few stories. #respect
Same here; the oldest program I've worked on was from 1993, meaning the code is older than me.
And it's not just banks that run this stuff, every company using SAP is running into the same problem now, because SAP runs on a language similar to COBOL
I've seen things you people wouldn't believe. Batch jobs crashing into the void of unallocated memory. Hex dumps glittering in the green light of a sysadmin's console, waiting for code review. All those moments will be lost in time, like bits in registers. Time to log off.
Someone is finally speaking about my hell. Feels like I've finally been represented lol. COBOL mainframe devs represent!
Tell your kids you love them. Not in person, because yeah, we know that won't happen; at least send them a fax.
@@outis2493😂
hell yeah
Mainframe developers, assemble!
How good is the pay though
For everyone in that chat commenting "just rewrite it": they obviously have never seen that tried in the real world on an enterprise-scale software suite. About 7-8 years ago, a software company I was working for decided to entirely rewrite a popular product in a "modern" language, Java, rather than maintain it in C. This product handled automated file transfers. Pretty simple, right? The old code was full of branches, corner cases, and "weird" one-offs. The company fired the old programmer and gave the rewrite to a team of younger engineers. The release of the new version was a disaster. Each of those baffling little branches of the code was there to handle some non-conformant condition in integrated applications the customers were using. Banks handling thousands upon thousands of complicated transactions a second just cannot accept code that might melt down in some rare but persistent corner case. Downtime can literally cost a big enough company millions of dollars a minute.
this "just rewrite it" would end up like the Windows 10 to 11 transition has been: losing a lot of existing functionality and, instead of reimplementing it, some ad functionality gets priority. Big enterprise-scale software is comparable to an OS codebase in size and complexity.
Better a horrible end than endless horror.
It can be done with the right people... but it will take a very long time and cost you a fortune. So this endeavor mostly only gets funded when:
a) there is no other way (e.g. you fucked up before and now it's too late) --> fail
b) you underestimate the time and effort needed by orders of magnitude --> fail
c) you overestimate your abilities --> fail
d) you rightly estimate it all, you have the funds, the people and time --> might work
I have heard of some very rare cases where a rewrite did work. They had everything on their side and barely made it.
The best way can be the strangler pattern. Don't rewrite the whole thing in one go, just put it behind a proxy that slowly becomes the new implementation, where you can do the most important and value-adding bits first.
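The strangler pattern described above can be sketched in a few lines. This is only an illustrative toy (the class and operation names are invented; in a real bank the "legacy" side would be a COBOL/mainframe system, not a Python class): a routing facade sends each operation to the new implementation once it exists, and falls back to the legacy system for everything else.

```python
class LegacySystem:
    """Stand-in for the old monolith being strangled."""
    def handle(self, operation, payload):
        return f"legacy:{operation}({payload})"

class StranglerFacade:
    """Proxy that gradually takes over operations from the legacy system."""
    def __init__(self, legacy):
        self.legacy = legacy
        self.migrated = {}  # operation name -> new handler

    def migrate(self, operation, handler):
        # Carve one operation at a time out of the monolith.
        self.migrated[operation] = handler

    def handle(self, operation, payload):
        if operation in self.migrated:
            return self.migrated[operation](payload)
        return self.legacy.handle(operation, payload)

facade = StranglerFacade(LegacySystem())
# Migrate the most important / value-adding operation first:
facade.migrate("balance", lambda p: f"new:balance({p})")

print(facade.handle("balance", "acct42"))   # served by the new code
print(facade.handle("transfer", "acct42"))  # still served by legacy
```

Callers never know (or care) which side answered, which is exactly why "nobody noticed" is the success criterion for this kind of migration.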
That's 'five nines' development. Must be up 99.999% of the time, five minutes of downtime per YEAR.
I worked at a bank and they were trying to move all their COBOL to Java. They did a few years of it, canned a bunch of COBOL devs. Then figured out that they couldn't even get close to the speed of COBOL in Java. They tried to hire back a bunch of those COBOL devs and started giving new hires out of college extra $$ to learn COBOL.
Not today, but this sounds like an amazing use case for Zig/Rust, where rewriting the codebase in Zig would be fast as in fewer human hours and blazingly fast as in CPU cycles
Edit: typo
Don't write a banking system in zig, maybe rust but not zig.
It's pre 1.0 and I don't know what they're going to do about stability once they reach 1.0
@@CamaradaArdi yeah, right now the language and its ecosystem are far from production readiness
@@plaintext7288 Also I think that Zig is way too low-level for this kind of business-logic-heavy application, but I could be wrong.
@@CamaradaArdi why not use C? Or write a C driver that can communicate with the COBOL mainframe?
I learned about the COBOL and the mainframe about 5 years into my first job as a Java/.NET/C developer. It's actually a sad story about the mainframe. The platform is fantastic; the Power8 and Power9 processors on the mainframe could run circles around the x86s running Linux, and the mainframe was doing things like LPARs, "containers" (though not called that) and VMs long before they were a thing on other systems.
The biggest problems were IBM's "walled garden" (more like "Supermax Prison Garden") for third-party software, and most importantly, the absolutely atrocious interface. ISPF, MVS, TSO, and green-screen 80-character crap were the boat anchor that sank the mainframe. If people understood what the mainframe could do, and IBM put a decent interface and decent automation into it without crazy menus and documentation you had to pay for, then developers would have begged to develop on the mainframe. It had all the functionality of the cloud years ago; it's just hidden behind stuff meant to emulate 1950's punch cards for backwards compatibility.
Backwards compatibility is the evil
No, the biggest problem was total cost of ownership.
It doesn't matter if one IBM mainframe was 2, 3, 4 times as productive as that x86 Linux system when you could run 20 Linux systems for significantly less.
Correct. You are so correct.
I work in insurance, similar story there. Lots of COBOL, text-based applications from the 70's, and the like. About 15 years ago we stopped using actual punch cards, and mainframe terminals are now emulated. Supposedly there is an old timer who lives in Fiji who we fly in on short notice any time something REALLY important breaks. Zero political will to rip it all out and rebuild. Trust me, people have tried.
It costs money and you cannot really market it to people. Customers don't really care if you use COBOL or Rust. From managers' point of view it's a system that works; migration costs money, and it's way better to rely on that one guy in Fiji in case something breaks. And if the manager in question is evaluated based on saving money, this is a clear "no" to any migration because it could affect the yearly bonus.
And yes, the one time investment into migration could save money for maintenance in the future, but managers don't think that far ahead because they might not even be with the company at that time.
@@ThatAnnoyingGuyOnTheInternet Yeah but at some point the net present value of maintaining and old system is greater than the net present value of replacing it, or is outweighed by other business considerations and risks, like no one understanding key operational systems. But that is clearly a long term concern.
I envy Fiji man greatly.
@@JGComments I have never seen mainframe migration which, at end, didn't cost more than original mainframe. They are priced accordingly to their tasks. Enterprise Oracle + Openshift + NAS still have both licence and running costs well into mainframe teritory. And with modern stacks you usually need more IT support people. That doesn't meen there is no value in migration. Latest migration I have seen was done purely because of future staffing.
@@bariole I agree that it would cost more. The primary issue is business risk of losing the ability to fix or maintain it, because for key systems you are putting large streams of revenue at risk which are much larger than the cost to rebuild, on a net present value basis.
My second job was at a bank; it was mainly COBOL, JCL, and Assembly for the first 1.5 years. Coming from Java it took me quite a while to get the hang of things. Almost all of our data was stored in flat files, so in order to read those you had to go look at the code to make sure you had the right header sizes, etc., otherwise it was just binary garbage. I remember thinking how dumb this was at the time and why not use a DB, but now, having created custom binary formats for stuff, it makes so much more sense.
Also assembly code reviews, those guys printed out what looked like dictionary sized books of a single assembly module and spent hours reviewing it, still crazy to me.
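The flat-file situation above - where the bytes are garbage until you know the record layout - is easy to sketch. This is a hedged illustration: the field layout, widths, and names here are invented, standing in for what the COBOL record definition (copybook) would specify on a real system.

```python
import struct

# Invented fixed layout: 10-byte account id, 20-byte name,
# big-endian signed 8-byte balance in cents. On a real mainframe
# system this layout would come from the file's COBOL copybook
# (and the text would be EBCDIC, not ASCII).
RECORD = struct.Struct(">10s20sq")

def parse_record(raw: bytes) -> dict:
    """Decode one fixed-width binary record into a dict."""
    acct, name, cents = RECORD.unpack(raw)
    return {
        "account": acct.decode("ascii").rstrip("\x00 "),
        "name": name.decode("ascii").rstrip("\x00 "),
        "balance_cents": cents,
    }

# Build a sample record and read it back:
raw = struct.pack(">10s20sq", b"0000012345", b"ADA LOVELACE", 1_000_00)
rec = parse_record(raw)
print(rec["account"], rec["name"], rec["balance_cents"])
```

Read the same bytes with the wrong `Struct` format and you get exactly the "binary garbage" the comment describes, which is why the layout had to be dug out of the code first.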
1991: We must administer this IQ test to see if you possess the aptitude for this career.
2024: Lost your job hanging drywall to Jose? Learn to code.
People learned that the IQ can be trained to some extent, or rather a low IQ is often a lack of such training and education. Sometimes the questions asked require a certain experience that people may or may not have had, too. We feel like we can get a simple answer on how intelligent someone is, but it's more complex than that in reality. The most common factor that used to be known was the kid who had afternoon classes, learned an instrument and joined a sports club vs. the kid that sat in front of the TV all day. While it's not exactly looking the same anymore, this factor of experience and training your brain remains unchanged.
Some banks were still doing IQ tests for everyone the last time I applied to one, like 4-5 years ago.
@@Kkubey your chatGPT tier rant didn't disprove intelligence pal
@@Jeremyak What has the world come to when making full sentences is called "chatgpt tier" already
@@Kkubey Yes, it was the sentences.. not the generic egalitarian liberal claptrap ma'am.
The article is gone. I suspect Substack thought they got DDOSed by this video. 🙂
No this was plagiarized. Looks like a few people have the same mom and story. One from seven years ago even went poof.
@@actually_it_is_rocket_science I'm sure several people had mothers that worked on these types of systems. My mother learned programming mainframes at a bank when I was 11 or 12. I had a hard time getting my parents to buy a computer for home because she had gotten tired of dealing with them at work. I didn't get to use a computer at home until my dad brought a Toshiba laptop home from work, which had the new 3.5 inch floppy disk drives! ;-)
@@actually_it_is_rocket_science Do you know if the story was at least true?
@@TheKennyWorld One line in the article was weird where he said that somebody like his mom must be making a lot of money. It's like he doesn't know how much his mom makes, which made me question the authenticity of that article (before I read this comment)
@@SandraWantsCoke Could just be regular old Swedish taboo around asking about people's salaries. Although at least I'm pretty sure if I asked my mom it wouldn't be that much of a problem. That said, I do not know my mom's salary.
My aunt worked in a statistics center back in the USSR and wrote COBOL code, translated into Russian, on an IBM/360 knock-off.
Asianometry did a video about the Soviet mainframe and its history.
that sounds both cool and maddening
One of my bosses was Russian. We loved her voice :)
My first non intern job was COBOL programming because I had COBOL classes at the university. That was 2018. I graduated in 2018.
I still like Cobol, I felt like a true hackerman working with ISPF
And that was only 6 years ago. Fascinating. Like FORTRAN engineering based programs are still lurking around. They work.
My company just helped a bank of similar scope to Nordea shift their entire banking core out of mainframe land (SDC). It can certainly be done. It was a gargantuan task, but nobody noticed they did it.
> nobody noticed
Good! That's the best way to migrate! I'm quite amazed!
So now which programming language are they using instead of COBOL?
@@ACium. I wasn't on that project, so I have no idea. No reason to think it's a single language either. It's probably Java though.
I've thought about learning COBOL because of industries going "we need people!", but man, I don't think I have what it takes.
It's a dated language but the basics (variables / units of code / control structures) are there. The real issue is the domain specific knowledge, both in technical and in business / legal frameworks that codify it.
That's a tough niche to take. Even if you learn it, it's still 0 experience in the field, nowhere to get it, and even if businesses need it, demand isn't great, so you can't switch jobs after that. You become trapped with a skill that looks cool to another programmer but horrible to HR.
find someone who can get you inside first
@@PhilipAlexanderHassialis absolutely... COBOL itself is the easiest part. It's the massive, complex legacy systems that are the difficult part.
It also doesn't actually pay that well. The pay is on par with any other technology.
COBOL is where you write a verbose English dissertation about how you want your program to work and hope that it actually ends up working the way described.
A language so verbose could only have been created by a woman :-P
And keep it within the columns
@@616Regis lol, Admiral Hopper’s ghost is gonna haunt you for that one 😂
@@616Regis More so by committee but Admiral Hopper was the chair of the group from memory.
what's funny is if you replace COBOL with ChatGPT in your sentence, it still makes sense
One thing I find amazing about IBM and COBOL is the levels of backwards-compatibility, to the extent of using virtual tapes and disks to be compatible with physical ones. Pretty cool imo.
Well, not that cool, when you have to allocate a file size in terms of blocks, tracks and cylinders instead of bytes (cause they don't grow) and you have no idea how many bytes a block is or the geometry of the disk...
Yes - yuck. I started out in engineering on a mainframe that was TOS - a CDC 750/760 with a Tape Operating System. By then it had disk drives, but you would still run your program, then have to 'rewind' the disk file to run it again. I can still probably type rewind,* faster than most other programmers.
It's been about 30 years since I touched an AS/400, but the terminal text editor assumed program statements may potentially be written to punch cards. There are reserved columns for line numbers, limiting the actual editor width to about 65-70 characters. All programs had to be formatted to fit this layout.
One of my coworkers' fathers is a COBOL programmer. Literally the youngest one at 65. They can't find anyone to replace them now that they are all retiring.
There actually are plenty of people, at least here in the USA.
@@lexmercatoria2774 ok? Not everywhere is the USA.
@@lexmercatoria2774 I'm in the US, and at 54 I'm one of the youngest I know that even knows COBOL, and actually had classes in College that aren't taught anymore in the US.
I began coding COBOL (and assembler) in 1979 in a bank (sort of). It was an IBM mainframe; the current name is IBM System z.
I recently began working again at my old workplace (as a consultant).
They have tried to kill both COBOL and mainframes since the early 1980s. It has never succeeded.
8:30 the age is immensely valuable there just bc of how many decades they have before retirement
in a field where almost everyone is 50-60+ as discussed earlier,
as a cobol manager,
it’d be pretty appealing to have a 20yo employee w the same domain knowledge and it’d be smart to incentivize them to spend their whole career at yr org
feel like prime missed the point there and took it as saying that being 20 makes someone a better programmer which
i don’t think at all was the point lol
chat got it tho i guess
😂 Yeah, Prime misses the point almost as often as I do! Reminds me that he's human.
If institutions want young COBOL programmers, they need to pay competitive rates. COBOL programmers are at the low-end of the salary range in the US. Below many other languages that have been around for a while, including some surprising ones (e.g. Perl). The languages that tend to be mostly senior programmers tend to have pretty high average salaries, but not COBOL.
Every few years there's a bunch of articles claiming COBOL programmers are an impossible to find dying breed and super in-demand, and every time I see those articles, I check job listings...and every time I see that it's BS. It's always low wages, whenever employers say they can't hire people to do a job. I think it'd be fun to wrangle big iron, but not if I have to take a big pay cut AND deal with stodgy banking/finance/government culture and workplace.
@joecooper1703 Where are you seeing that? On indeed and linkedin, I see 65-85 per hour. (When listed.)
@@joecooper1703 the problem is COBOL programmers don’t really bring in new cash flow or new add-ons the sales team can offer to clients. They’re just there to keep the lights on and the banks running within regulation at this point. So they get treated like IT Security in that the execs won’t see the value in the investment until it bites them in the ass.
I'm one of those from the "Deep Past" that worked on this type of stuff starting in the 80's. We referred to the database as IMS DL/I. Used it at a pneumatic tool company for part explosion of a tool. As for GSAM, it's a Generalized Sequential Access Method that would have been pronounced "gee-sam" and not "gasm". I don't recall working with that specifically, but I do recall working with ISAM and VSAM files. My first job was with a medical billing company. Through attrition I was responsible for thousands of lines of code and at one point on-call 24/7. That got old real fast. As for ISPF, Prime, I was probably as good at that as you are with Vim! Our monitors had one color - green - and the keyboard was a clacky, metal IBM one that you could kill somebody with. Oh, and we sat in cubes all day in suits. Today's developers piss and moan about their work conditions - well, LOL to them!
VIM.. that brings back memories
Uhm except that working in a cubicle wearing a suit sounds like the dream. These modern open plan startup spaces are all about cramming as many people as possible into a small space and making it as easy as possible for management etc to come and disturb you while you're trying to work so that they can feel more important. I work remote these days but back before covid I would have done anything for my own cubicle.
Like yourself I wrote COBOL all throughout the 80’s. Myself along with 2 other equally skilled IT members coded an entire online/interactive MRP system for a machinery manufacturing company. This ran until that monster “Year 2000” scam. The system we coded in the 80’s was already 2000-compliant. But management was convinced they needed some packaged software to survive the 2000 dates. It was the saddest time of my life when we were assigned to trash our code for that canned software. Our code would still be running today if management had had the balls to believe us ☹️
@@ronh7384 LOL Y2K...good times!
GSAM files are used when writing sequential files from some type of online applications. They are used because their updates can be automatically backed out by the system unlike traditional files.
You just drew some money out of a cashpoint? Probably an IBM z16 running COBOL. These things are everywhere and have 99.999999% (yes, 8 nines!) reliability, and are absolutely state of the art (makes x86 stuff look like a Commodore PET!). IBM z16s can run programs from the old System/360 from the 1960s and System/370 from the 1970s unchanged. COBOL is amazing at data in/out as it memory-maps the files in and out with ease. It's all built in. It's insanely fast!
A lot of the bank machines used to run OS/2 1.3, not sure what they run now. I have seen a blue-screen on one, so some run Windows.
I can't tell you how many times I had to get on a System/390 and run HX to kill off COBOL programs that did infinite loops dragging down the mainframe, I was just hoping that mainframe was separate from the Uni's production workloads.
@@MorningNapalm The IBM z16 is a server. The cashpoint clients can run on many different kinds of hardware and software, as long as they implement the correct API for the financial service providing access to the banks. In most cases businesses buy the equivalent of a proxy service which interacts with all the different bank-specific systems behind the scenes.
@@ossman11 the point is that the final back end being called is most likely a Z series system from IBM.
And change never happened without massive problems
Had a story from my uncle who works at a bank.
A new IT employee shut down the Mainframe (don't know exactly how).
They had to call people out of retirement to fix the aforementioned person's mistake
They will change language when everyone is dead
That's wild.
Oooof
I worked on mainframe systems starting in the early 80's. I once did something that seemed to crash our mainframe. I talked to tech support and they said that was impossible. I showed them what I did and it crashed again. They just told me not to do that anymore...
Btw. one of the favourite stories my father used to tell (he was an IBM SE, so he basically sold mainframes and the system software to run on them) is when he was asked to benchmark the at that time newest IBM mainframe against the offering from Amdahl. What he needed was a program to chew CPU, to demonstrate task switch speed. So he basically wrote one, in assembler. The simplest assembler program anybody could write, just a jump (sorry, they call it "branch") to the instruction itself. What he didn't realize is that while IBM hardware checked for interrupts before executing the branch, the Amdahl hardware didn't check for it until the hardware wasn't executing a branch instruction anymore. But with the next instruction being, yes, a branch, that checking for interrupts never occurred. So he basically hung the machine with this simple program.
The solution was, of course, to just add a no-op before the branch, and jump to that.
Oh, and btw. ISPF is cool (albeit scary if you have the mainframe at your fingertips). I liked it so much, I used the PC version (called SPFPC) for part of my first programming job (Turbo Pascal + a dBase compiler called Clipper, and the code for the latter is what I used it for). Btw. ISPF stands for "Interactive System Productivity Facility". IBM had quite a knack for calling stuff very basic things: their main programming language was PL/I (literally "programming language number one" :P), disks were called DASD ("direct access storage device"), etc.
oh god - Turbo, Clipper, dBase IV... you've just given me a serious PTSD flashback. Nice!
Although for a short time at the beginning of my career (1999), I used ISPF and shortcuts like i.3.4 or i.2 are still in my muscle memory. The text editor was great too: only 80x25 characters but with syntax highlighting and neat copy/paste system as well
Yeah, ISPF is great. It is so powerful if you know how to use it
My first job out of college was to black-box migrate applications from the IBM 4361 (VM/370) to the brand new IBM PC (developed using dBase IV and compiled with Clipper). One salary app used for proposing annual pay step/grade increases gave me fits. I could not exactly duplicate the mainframe output calculations from the same test dataset. I finally quit trying code changes and prototyped the matrix in Lotus 1-2-3 (spreadsheet) so I could see changes to the calculated values instantly. It took about an hour but I found it. The original programmer rounded the calculated row values to two decimal positions but let the machine default to sixteen decimal positions for column calculations. When this discrepancy was presented to management, the decision was made to carry the error forward into the new application for backward compatibility. 😂🤣😅
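The round-the-rows-but-not-the-columns mismatch described above is easy to reproduce. A toy illustration in Python's `decimal` module (the values are invented; the original was COBOL on VM/370, and the rounding mode there is an assumption):

```python
from decimal import Decimal, ROUND_HALF_UP

CENT = Decimal("0.01")
values = [Decimal("10.005")] * 4  # invented row values

# What the original program did for rows: round each value to
# two decimals first, then sum.
row_total = sum(v.quantize(CENT, rounding=ROUND_HALF_UP) for v in values)

# What it did for columns: sum at full precision, rounding only
# at the end.
col_total = sum(values).quantize(CENT, rounding=ROUND_HALF_UP)

print(row_total)  # 40.04
print(col_total)  # 40.02
```

Two cents of disagreement from four identical numbers - exactly the kind of discrepancy that only shows up when you can watch the intermediate values change, which is why prototyping the matrix in a spreadsheet cracked it.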
@@OlivierDALET=3.4 (lol)
Started writing COBOL in 1978 on punch cards to run on an IBM mainframe. Retired in 2013 after an extensive career path: IMS systems programmer, DB2 systems programmer, Unix administrator, tech analyst, performance analyst, writing scripts/programs in COBOL, C, C++, Java, Visual Basic, C#, VB#, Perl, Excel VBA, SAS against Oracle and Sybase databases and Unix/VMware servers. Retired 10 years now, coding mainly in Python in VSCode with some R and Mathematica, with help from Microsoft Copilot, Meta AI and Perplexity AI, against large public astronomical online databases. Never gets old. 👨💻👨💻👨💻
Can you be my mentor?
hahahaha, love this. did a year of vocational training in COBOL 10 years ago and work as a COBOL developer today as a 34 year old in Sweden as well. my first system I worked in was 8 years older than me.
Mainframe COBOL Programmer just sounds badass!
They were only badass when you had to fix their code. (Dirty little secret - a large percentage of legacy COBOL code was actually assembler 'translated' to COBOL so that the assembler programmers could keep their jobs. Most of them hated 'high level languages' and thought assembler was superior because assembler "gave them complete control over the computer".)
You can't come close to the bandwidth mainframe COBOL has. Look up YouTube videos on mainframes.
My first job out of college a couple years ago I was hired at a consultancy. The first project I was placed in was for an old manufacturing company who needed their ERP system modernized since the old one was written in COBOL and running on some ancient IBM mainframe, also the guy who had written 90% of it had retired long since and nobody fully understood all the logic that was integral to the functioning of the company. Had to spend quite a bit of time combing through old COBOL code trying to understand what it did so I could reimplement the same logic in the new system, never again.
someone in the future is going to go through your code and think what idiot wrote this in C or whatever you used and have to rewrite it in excel.
That's silly. I work in the ERP space. We just record the in and out and then create requirements from that. No need to analyze code, just behaviour
@@mattymattffs ya that's what we tried telling the customer, but they didn't understand how it worked and wanted the new one to work the exact same as the old one. They even wanted me to replicate arithmetic errors caused by truncating decimals in intermediate steps of a calculation. And you know what they say, the customer is always right.
What’s surprising is that it wasn’t all documented when they had to remediate it for Y2K.
So I'm a young-ish hire at IBM and I'm learning all of these technologies. IMS, DB2, ISPF, this is all just another staff meeting for me. It's pretty cool to hear an "outsider" hearing about it for the first time and offering their thoughts. I don't do COBOL, but I do some programming in REXX, which you could sort of compare to Bash for the Mainframe.
I'm a former COBOL programmer trying to learn REXX in new role. Where I work, we've hired recent grads to learn mainframe. It's not going anywhere.🙂
I found learning Python really easy after coding in Rexx on the mainframe for about a decade.
In the 1980s, I worked in banking, but tried to avoid COBOL. I was programming in TAL (Transaction Application Language) on parallel-processor, fault-tolerant Tandem NonStop systems using relational databases. They were acting as the front end for ATM and wire transfer networks for IBM mainframes. There was an interface on the Tandem system that made it look like a tape drive to the mainframe. Every night, a batch job would run to update the account information (balances, etc.) on the Tandem ATM system, so I was tasked with development both on the Tandem and on the mainframe in COBOL. Also, the Tandem used a client/server environment where the client programs for the terminals were written in SCOBOL (Screen COBOL), a version of COBOL.
There was nothing quite as irritating as submitting a batch job on the mainframe to test on the interface and waiting for the job to execute just to find out that you missed a period.
My dad is a now retired COBOL programmer for a large insurance institution. These stories had a very familiar tone to the ones I heard growing up. I sent him the video and am seeing what he thinks of it.
My mom programmed in COBOL since the mid 1970s. I learned about computer code by reading the stacks of greenbar she would bring home with code listings on it, generated after they input thousands of punch cards. She retired in the mid-90s and took on a much less stressful job at a local museum instead. She, however, stoked an interest in computers and programming in me. I was a software dev for well over twenty-five years and now I work in the appsec world. Of all the stress I have had in my career, none of it would match the stuff she had to go through in hers: 24/7 on call, along with having to make everything right the first time because it would take hours to fix even a small mistake and resubmit a job.
Started coding in COBOL in 1980 and still am. Sometimes knowing legacy systems is the pathway to job security. :-) I hope to retire next year. I'm tired.
I worked in banking in the past and it is rough. I also caused an outage that put us in the news… not a fun week
I got my first job at IBM as a mainframe batch processing operator and I have to say it was an experience like no other. We were working for a large bank in Poland and I remember some of those things the article is about, like batch processing, transactions, etc. There were situations where batch processing (at night) was prolonged, e.g. due to the size of the processed data or some error, and as a result, certain jobs that acted like a "cut off" of the day (which normally should have been run at night when most stores were closed) were run in the morning when the stores were already operating, which led to a temporary (approx. 15-20 minutes) blocking of their payment terminals. Fortunately, I haven't worked this way for over 20 years, unlike others, but I'm so grateful that I had the opportunity to work with such people and see what it's like. And the responsibility is huge when it comes to banking and money-related operations.
I remember our operators leaving the newbie in charge while they piled off to the pub for a birthday celebration (oh, the good old days); all he had to do was feed the printer and call the pub (across the street) if a problem came up. They returned to find there was a logical recursion loop in the job execution sequence, four hours of duplicated reports :D
It would be fun if you'd actually write and show some modern COBOL. I hear the modern stuff (that runs on VMs, I guess) isn't even bad. It's just a DSL for business. The first-ever, even!
The old-fashioned real stuff isn't bad either!
This is a great article. Loved your read of it. My first paid programming job was in 1991 (Turbo Pascal 5.5, pre-Windows). This was for a very large 'Britain-based Telecommunications' company. We had to call statistical library code in MS-DOS Fortran.
The biggest change in the last 30+ years, to me, is the editing and compiling tools. The conveniences that Sublime Text or VS Code (even VS) bring would have been incredible to 21-year-old me - far more than any graphical, speed or language change.
Languages change, algorithms wax and wane, libraries and APIs come and go. But back in the day we often had no easy way to read one file while editing another, and build times could be measured in tens of minutes (it was easy to forget what you were tracking down and trying to deal with, when you had an enforced 10 minute break before you could check the outcome of your tweak). Your (physical) notebook was your friend. The official documentation was not a PDF or a website but instead in dead-tree format (often excellently-written).
I'm a programmer in a bank. Something you struggle with when you start (I did as well) is how much process and how many gates there are to getting things done. I can have a fix ready in a couple of hours, but it can take 3-5 days to be deployed going through deployment processes. This isn't because we like process, but because we have regulatory obligations, segregation-of-duties constraints, risk and security controls, and more importantly we are dealing with money and sensitive PII; a transaction screwing up and somebody not getting paid has big real-world impacts on people. Failures of the platform I work on can cost us a lot, as it's used for regulatory reporting, anti-money-laundering and fraud monitoring.
We still have mainframe systems and handle EBCDIC encoded data produced by them.
Any publicly traded business actually.
I have seen the IBM Mainframes kick up at the stroke of midnight many times. Ahh the good ole days.... I don't miss them lol.
That is one bad ass momma bear....handling the kiddos while wrangling COBOL omegalul, she is factually a BASED super mom.
I've worked in a shop with both mainframe and PC development, and the research studies I've seen at many other larger companies show that many migrations cost millions of dollars and take years to complete. I've even read about several that cost millions and took years and at the end of the project, it was too incomplete/buggy to proceed so they scrapped it. It is indeed NOT a trivial task. I'm not a mainframe advocate but that platform is designed for high throughput and heavy I/O and is very stable. Of course it comes with its own set of drawbacks and limitations but that is true of everything in life. Use the best tool for the job.
A company I worked for in the 2000's had a large codebase written in Honeywell assembler in the 1970's. After a few tries at migrating the code to something modern without any success (no one knew everything about everything in the system), and with the Honeywell hardware it ran on no longer available, it was decided to write an emulator in "C" to run the code on IBM hardware running AIX. Because I had been a tech for Honeywell, I was tasked to explain all the weirdness of the H2000 instruction set to the youngish engineers who were writing the emulator, then figure out all the tricks the original programmers had done that didn't exactly conform to the way the instruction set was described in the manuals. When everything was finally running it outdid the original hardware by a large factor (the original Honeywell hardware ran at 4 MHz). As far as I know that system is still in production.
When I started as a mainframe programmer back in the early 80's one of the first jobs I had was to work on a team that was replacing an Assembler system written in the early 60's. In order for that system to run nightly they had to IPL (initial program load - reboot) the IBM mainframe and run an emulator just for that system and then re-IPL when it was done. Operations was so happy when we finally retired that system.
You should talk to Veronica Explains (@VeronicaExplains). She is a COBOL dev (and also sys admin?) and has a great channel. Getting her perspective would be super interesting.
One of my university courses was making a payroll system using COBOL and PowerHouse. It was over 300 pages when I compiled and printed it. I was the only person in the class to get 0 errors.
Yeah. I'm in the position of maintaining about a million lines of code written in an ancient compiler. I actually wrote most of it in the past twenty-five years.
It just happens to be a mission-critical program for oil refineries around the globe.
Does it pay a squillion?
When you understand a language, you also understand where it shines the most. For banking, COBOL was perfect because it handles transactions very well. However, the last time I was working with it, I understood that at that time the transactions were done at night, on a couple of mainframes.
Can agree with the statement on rewrites. I'm working for a SaaS that has existed for about 10 years now, originally written in PHP and React, and in the process of being rewritten for the last 2+ years at this point... We're still not at the agreed "feature parity" (which would skip implementing some features in the new version), despite having more than twice the number of devs the old system saw at its peak (4 vs 9). Even though I've been here for 9 years, I didn't foresee how big of a waste of time it would be to rewrite everything (including the backend) from scratch... Don't do it folks, just don't do it 😂
42 years in IT and almost all in COBOL development. Worked for a bank, at a large corporate office, and for the last 28 years for a company that had a credit card processing application that ran on an IBM mainframe. One of my primary tasks was developing API transactions that allowed mid-range applications to use mainframe transactions. We used MQ Series communications, which is an IBM product that allows any remote application to communicate with the IBM mainframe. It was highly successful. We used it to develop web-based customer service applications and customer-facing web sites. This allowed internal users and customers to use web sites to access what they needed without interfacing with mainframe screens. Basically, we put a pretty face on the mainframe, but all the real core processing still happened on the mainframe. Our mainframe system had approximately 6,000,000 lines of COBOL, and completely rewriting it would be extremely high cost and high risk.
Every year that I've worked at my current employer it's been, "this is the year we're going to migrate away from the mainframe." They never die.
Mine has finally accepted that it's now about modernizing (the access to and what runs on) mainframe, not leaving.
Around 1990 the mantra was 'the mainframe's dead' and 'paper's obsolete. IBM and wood-pulp stock tanked. I failed to secure my 'furniture' loan and so missed that boat completely :D
Worked at a life insurance and pension company on a mainframe code migration in 2012. Looked into the COBOL code and found a comment in a module:
Feb. 2008, This code has year 2000 bug checked.
This code was designed to run when its developers would probably be dead 😮
I've only poked around COBOL out of curiosity but... I kinda unironically loved it. I do need to do a bunch more (and I do have a couple of project ideas for getting deeper into it). I don't yet know it nearly well enough to judge how much I'd like it in an actual project, let alone a giant legacy one doing essential jobs, but it does interest me.
IMS is a hierarchical database built on top of the mainframe file system. The mainframe has a filesystem which has more in common with an ISAM database layer than with a modern disk filesystem like NTFS or ext. Mainframe files are of fixed size, with segment growth; they are structured, and there is transactional support for everything, etc. There even exists "The ISAM" (a product): it **was** developed on the mainframe, and then its concepts were copy-pasted to all the other databases. DB2 is a classic relational database, and on the mainframe it is effectively an SQL interpreter plus indexing running on top of the file system (remember, it's structured and transactional). IMS and DB2 on the mainframe are like having two query/organizational engines on top of the same data layer.
Anyway, it seems they are doing some batch bank clearing job (scheduled file upload) which is not very well organized. The chosen data structure is not particularly suitable for the task at hand, so their way of querying is to run linearly through the data, maybe even with multiple passes. Hierarchical databases like IMS or LDAP are fast if you are querying along the hierarchy.
If you are going against the hierarchy, well, it is like taking SQL, doing a Cartesian product of multiple tables and then querying that. 10 TB of data is not a lot for even the smallest mainframe. For relevance, even the smallest z16 mainframe, like one rack of computer, comes with terabytes of RAM to begin with and has CPU cache sizes well into gigabytes.
I guess somebody designed something small 50 years ago, and now the business around that program has grown to a monumental scale. Now there are multiple programs doing god knows what around an original data structure designed one Tuesday afternoon for a minor program which became wildly useful for a particular period of time. That program has been offline since the early '80s, but the data structure persists.
They could migrate, strangler-fig the system, but as long as the business side sees no value in migration, everything is going to stay the same.
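The with-the-hierarchy vs. against-the-hierarchy distinction above can be sketched in a few lines of Python, using a nested dict as a stand-in for a hierarchical store. All names and figures here are invented for illustration:

```python
# Toy hierarchical "database": branch -> customer -> account -> balance.
db = {
    "branch-01": {"cust-a": {"acct-1": 100, "acct-2": 250}},
    "branch-02": {"cust-b": {"acct-3": 75}},
}

# Querying WITH the hierarchy: direct path traversal, cost ~ tree depth.
balance = db["branch-01"]["cust-a"]["acct-2"]

# Querying AGAINST the hierarchy ("every account over 80, whoever owns
# it"): nothing for it but a linear walk over the entire tree.
big = [acct
       for customers in db.values()
       for accounts in customers.values()
       for acct, bal in accounts.items() if bal > 80]

print(balance, sorted(big))  # 250 ['acct-1', 'acct-2']
```

The same asymmetry is why IMS flies when the access path matches the segment hierarchy and crawls (or needs multiple passes) when it doesn't.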
I am a mom who was a COBOL programmer for 20 years, and I just recently shifted to a more mainframe system admin role. It's blazingly fast for transaction processing; nothing can match it. If it slows to more than 60 milliseconds per transaction (via a CICSplex - CICS programs are written in COBOL), we look at where the issue might be. It slows down dramatically when it calls a non-mainframe system for something, like currency conversion.
Like Tuxedo service over TMA, yup.
What an absolute nightmare. I've written a CNC control system entirely in Motorola 68000 Assembler, and 30 years on, the machines are still running without any software updates, hard drives or anything from the modern world. Every bit of that code is exactly as it was written in the final iteration and never changes. That's a huge problem if it has bugs, but when it works, at least you know that it will forever be the same.
Embedded systems used to be a nightmare because we couldn't easily fix things in the field. Nowadays, you can sometimes update them over the air. However, it's still scary to know that millions of units are going to be out in the field, and you can't fix them if you've made a terrible error.
Thats badass man
Remember, we are about 25 years after Y2K and we are still discussing the same subjects.
I left IT a few years after Y2K. Our divisional head hated our section. He thought mainframes and UNIX were a waste of money as Windows was the future. The first he got rid of was our manager. We worried about him because he was a COBOL programmer who spent more time looking after the payroll system than managing us tech support peeps. Yet he was never out of work once he left. He worked less; only about 8 months out of 12. He ended up earning twice as much money too. The other chap (from another section) they made redundant was a PC chap. He ended up working in a warehouse stacking shelves.
@@clangerbasher Damn
@@javajav3004 Our divisional head was a lunatic. It was as if he thought we were responsible for the company owning the mainframe. It was a COBOL application that prevented the mainframe from being switched off. They eventually cobbled together a 'system' to replace it. But it was a disaster. Luckily the company joined a much larger entity about 2 years later which saved its bacon. The obvious solution to us in tech support was for the company to have bought a nice AIX server and just move the application across on to it.
My dad worked with a man in the 1990s whose mother came out of retirement because she was a COBOL Queen.... and that was 30 years ago... the world still runs on it.
Node is cutting edge in a sense that it's going to hurt you.
A “module” is a set of code (with usually one entry point and one exit (return) point) designed to perform a limited task. It would be “link-edited” with other “modules” prior to execution. There is usually a “control module” (which controls the logical flow of the program) which “calls” many sub-modules by passing parameters to it and the sub-module “returns” control to the control module, along with the result(s).
For example, many modules would be link-edited together (to form a program) and execution would be passed to the control module to calculate payroll. The payroll "control module" would pass a set of parameters to a sub-module responsible for calculating all state income taxes. The submodule would calculate the tax for whatever state was requested using the parameters passed, and return the calculated amount to the control module. The control module would then call subsequent submodules until all functions were accomplished.
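The control-module/sub-module pattern described above translates naturally to any language with functions; here is a hedged sketch in Python rather than COBOL. The function names and tax rates are invented for illustration:

```python
def calc_state_tax(state: str, gross: float) -> float:
    # Sub-module: one limited task, parameters in, result returned.
    # These rates are made up purely for the example.
    rates = {"CA": 0.09, "TX": 0.00}
    return round(gross * rates[state], 2)

def run_payroll(employees):
    # Control module: drives the logical flow, "calling" sub-modules
    # with parameters and collecting the returned results.
    return {name: gross - calc_state_tax(state, gross)
            for name, state, gross in employees}

result = run_payroll([("ada", "CA", 1000.0), ("bob", "TX", 1000.0)])
print(result)  # {'ada': 910.0, 'bob': 1000.0}
```

The main difference from the mainframe version is mechanical: there, the "call" crosses module boundaries fixed at link-edit time rather than being resolved inside one source file.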
I am a Y2K “survivor.”
Mainframe Programs are written in a specific language (COBOL, ASM, Fortran, etc...). They are then individually turned into Object Modules (machine code) with specific characteristics through a Compile or Assembly process. Object Modules then go through a Link Editor or Binder process to create a Load Module with a specific Entry Point and characteristics. Load Modules are what mainframes actually execute (EXEC PGM=Specific Load Module) even if they contain only one actual user created Program (as Load Modules always contain multiple system supplied Object Modules as well). If you have multiple Object Modules working together then the Link Edit or Binder process would combine them into your created Load Module assuming they are Statically referenced. Any Programs that you want to Dynamically reference have to be turned into their own Load Modules and placed in an appropriate Load Library and normally will NOT be included in the original executed Load Module (as they will be loaded by the system when actually referenced).
So Programs are written in a specific language and turned into Object Modules. Load Modules (always containing multiple Object Modules) are executed. Load Modules can contain hundreds of Object Modules combined together (from multiple different programming languages) and can reference other Load Modules (containing multiple Object Modules) Dynamically during the same execution.
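As a loose analogy only (Python's import machinery is not a Binder, and nothing here is mainframe-specific), the static vs. dynamic distinction above can be sketched like this:

```python
import importlib

# "Static" reference: resolved up front, before any work runs --
# loosely like an object module link-edited into the load module.
import json

def handler(format_name: str):
    # "Dynamic" reference: the module is located and loaded only when
    # actually referenced at run time, loosely like a dynamically
    # called load module fetched from a load library.
    return importlib.import_module(format_name)

print(json.dumps({}))           # static dependency, already loaded
print(handler("csv").__name__)  # csv, loaded on demand
```

The trade-off is the same in spirit: static binding gives you one self-contained executable unit, dynamic binding lets pieces be replaced without rebuilding everything that references them.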
The government agency I was hired into 7 years ago is on year 14, 15? of project “get off the mainframe”
I remember working with a small company when the government updated the API that handles tax information and transport manifests. The company lost a lot of money on the service downtime while the devs were scrambling to get everything up to date; the last thing that crosses your mind at that moment is a rewrite.
When I heard you pronounce SQL as squeal I physically was injured.
I hate how he does that 🤣 pretty sure he does it just to annoy me.... And it works 😡
Sometime around 1991, I created a scrollable listbox/combobox in COBOL on a VAX system that used VT320 (they might have been 220) terminals. I created a fully functional text editor (modeled after some Unix editor whose name I forget). These were monochrome monitors with no graphics cards of any type. There is nothing wrong with the COBOL language. The problem is programmers who lack the necessary vision and imagination to create solutions to problems.
No, most are using modern high-level programming skills and forget to dumb things down to what will work with COBOL.
We are only a 100th the size of the US (by GDP). But the major banks where I live all moved much closer to real time: if I transfer money to someone else's account at a different bank, it'll take at most 15 minutes, normally significantly faster. This only became a thing in the last 5 years or so here. But the idea that banks can't change feels a little dumb; it feels more likely that they don't want to change.
I worked in banks in Mexico; none of them used COBOL. Other Mexican banks do use COBOL, but they were all replacing it with the newer COBOLs, Java or C#. Part of the reason is that mainframes are way too expensive and only American or EU banks can afford them. So even the largest Mexican banks with COBOL mainframes are using second-hand small mainframes, and they keep using them because they are still freaking fast; a college professor told me like 10 years ago that their mainframe was still faster than their cluster.
"just rewrite it" - It'd be nice if we could have some sort of legal fine for people that misuse the word just. "Just fly to the moon", "Just secure it" "just get it done" "just figure it out". Rewriting something large and established is almost always a disaster. Maybe in the long run done well it could be a good idea but you'd take your problem of lacking people with legacy development experience to maintain an old system and turn it into the problem of lacking dramatically more of those same people who also are competent in whatever new language you want to use so that you can maintain the old system while writing a new one. The best approach is likely something that can interoperate with the old system so new stuff can be written in something more modern and old modules used or updated as required but I'm not sure that tooling for such a thing exists.
Well, I tend to agree, let’s play devils advocate for a minute.
If you swap out pieces you are forever stuck with the original workflows, and business processes, when starting over might make a lot more sense.
An overly basic idea, we replaced printers with terminals and 50 years later we are still stuck with CR LF interoperability issues that would never have happened if you’d started completely fresh with a variable sized semi-graphical terminal. But we replaced printers with digital versions and only later added new stuff.
How many decades did it take to replace BIOS?
In business this happens too, the old program’s capabilities and limitations created the business process, and now the business process (along with 1:1 interoperability with code on the old system) dictates how new code is developed.
A fresh start is incredibly disruptive and often not the best choice, but replacing individual pieces without re-thinking the overall design is also very costly in the long run.
@@thedave1771 for a bank many business processes are set out in law and the complexity of them is why the COBOL code is so valuable. For other orgs changing business processes is a major undertaking for an organization and the software is very unlikely to be the limiting factor in an organization large enough to have their own custom software.
A fresh start for a business is a decision the business could make, but it is very risky. Time and again we've seen with software projects that total rewrites of large systems are total disasters. Remember KDE 3 -> KDE 4? Hell, Python 2 -> 3 was insane and it really didn't change all that much.
I have been a COBOL programmer since 1987! My first was a Sperry Univac S-80, my second an IBM AS/400, and for PC (Zilog Z-80, and DOS 3.2) it was RM/COBOL. Now I am 60 years old! A living Legend.
"There's nothing wrong with Cobol"
"Our bank was down 16 hours straight because someone forgot to add a dot, and we have no local dev environment to catch basic mistakes like these"
to be fair, that seems like an issue with how they have their development cycle set up, not with COBOL the language itself. missing a semicolon or parenthesis in a modern language will do the same thing if you didn't have fancy IDEs with squiggly red lines to warn you lol.
That's not a COBOL nor a mainframe issue. Every client I've worked for had multiple local dev and certification environments.
WTF does Cobol not have a compiler or something that does simple syntax / static analysis??
That's because someone didn't test the change in a lower environment first. Not because it was COBOL.
That's a problem with development practice, she even says later on that she isn't that worried about pushing code since they have a robust test environment. I wouldn't be surprised if the 16 hour downtime was the impetus for creating the test environment. It's not the language, it's the lack of testing that caused the downtime.
I was also born in '64. I had a course in cobol programming back in '84.
Everything about that course was unpleasant: the IBM operating system, the line editor for editing our files, the disk crash that lost our final assignments.
I felt a bit disillusioned at the end of my master's (not CS) and applied for a COBOL job since no Real Programmers want to go near it and the money is sick. The interviewer said they're looking for someone passionate about insurance, who can really dive deep into the field and stay for the long haul. That's when I realized what universe of boredom I was gazing into.
Yeah, I think that's the most unappealing part of COBOL. The problem is not the language itself. It might suck, it might be weird -- JS, PHP, and Rust are idiosyncratic, "weird" languages themselves. The problem is that you _know_ you're only valuable as an expert on a technology that is not yet dead -- even if everyone involved wants it to be.
Your employer's incentive is to have you stay with them for as long as possible, hopefully until retirement, tackling the same types of problems from the same perspective in the same business domain. However, the longer you stay with that technology stack, the more you're missing out on newer stuff, and burying yourself deeper into the COBOL "stack" -- and _that specific business_' business model.
If you're a run-of-the-mill "full stack" dev -- even if you're mostly focused on one specific thing, like front-end -- you can hop between wearing front-end, back-end, and even infrastructure hats. Think you've been way too long on the same project/team? You can easily switch that for another team -- or another job. Work with COBOL at a bank? Sure it pays well -- great even -- but that's _the_ thing you'll be doing for a loooooooooooooong time.
I have some COBOL videos. It's a pretty weird language; maybe its worst deficit is the complete absence of dynamic memory allocation.
I work on an ERP system that is COBOL in the backend; modern paradigm devs always laugh at me and call it virtual punch cards.
This is incredible. I had a short stint working contracting with Government entities. A lot of their systems were in IBM mainframe COBOL systems that we interfaced with integrations. I didn't do a ton with it but it was very interesting stuff and a major source of pain to some of my coworkers.
I did Cobol and CICS in 1995 as a college internship. I'm not fond of the language, but I could be convinced to do it again given enough money.
Funny you mention that. I am reading; "CICS A How-To Guide for COBOL Programmers".
@@scottstempmail9045 I still remember laying out screens on graph paper and using a tool called mapcomp to generate the BMS macros for each transaction. The stuff I learned doing CICS served me pretty well when I started doing web programming with CGI since it's basically exactly the same programming model.
I learnt some COBOL in 1995 before learning C; not sure how close this was to mainframe COBOL, but I enjoyed it. The Y2K "bug" had nothing on the problems we were storing up for ourselves with COBOL underpinning the financial system. I am now a thor goblin using GameMaker for fun.
jeezus, I thought IBM AS/400 was primitive
I respect COBOL and SCADA programmers...yall are a special breed
I looked up SCADA and I see human interfaces and PLCs for factories, servers. Is SCADA used to run a factory? Were any factories automated by mainframes?
@@thevincent1015 I think some are...but the main thing is, they are similarly difficult/complex. We take so much for granted these days lol
Never got to touch a newfangled AS/400. Spent my formative years coding RPG and COBOL on System/34 and 36 minis. 🤓
I know nothing about this subject but the word Cobol in the title caught my attention. I'm 67 and when I started college back in 1974 I found myself living in a dorm filled with engineering students. They were all absolutely obsessed with learning Cobol. There seemed to be endless Cobol classes and the students clearly considered learning it to be essential for their career success.
COBOL and FORTRAN were the main languages of the 1970s.
Damn and I thought the 15 year old Visual Basic I have to deal with at work was “legacy” 😂
Was taught mainframes and cobol at my first job after getting my cs degree in 2002. The biggest thing I think this article missed was how incredibly strict their code standards are. Only a small portion of the training was learning cobol itself, most was learning this company’s specific standards such that given a task, two programmers should produce almost exactly the same code.
Every part of the development process was painfully slow and cumbersome. Just running a program involves scheduling a job that ‘prints’ to a virtual printer (the terminal), and the default was still to print to dotmatrix paper that the mailroom would dutifully deliver to you. The db was still classified as ‘cardless files’, and everything we wrote could technically have been encoded on punch cards (which were not hard to find still floating around in the drawers of old timers).
It was fascinating to learn all the old optimization tricks like using packed decimals and hex encoding to save just a few bits. It was the only time in my career that I’ve worked that low level. I value the experience, but definitely don’t regret giving up the job stability that may have come from staying on that path.
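Packed decimal (COMP-3) is a nice example of those tricks: each byte holds two BCD digits, with the final nibble reserved for the sign (0xC positive, 0xD negative, 0xF unsigned). A hedged sketch of unpacking such a field in Python, with invented example bytes:

```python
def unpack_comp3(data: bytes) -> int:
    """Decode a COMP-3 (packed decimal) field into a Python int."""
    digits = []
    for b in data:
        digits.append(b >> 4)      # high nibble
        digits.append(b & 0x0F)    # low nibble
    sign_nibble = digits.pop()     # the very last nibble is the sign
    value = int("".join(map(str, digits)))
    return -value if sign_nibble == 0x0D else value

print(unpack_comp3(bytes([0x12, 0x34, 0x5C])))  # 12345
print(unpack_comp3(bytes([0x00, 0x98, 0x7D])))  # -987
```

So a 5-digit signed number fits in 3 bytes instead of 5 characters, which is exactly the kind of saving that mattered when every byte of storage was counted. (Real fields also carry an implied decimal point from the PIC clause, which this sketch ignores.)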
We could adopt a perspective where COBOL and Perl aren't seen as ’ancient’ languages but as simple tools for simple tasks. You wouldn't call a light switch outdated, just simple. You wouldn't see implementing a light switch with a microcontroller as inherently more sensible than replacing a broken switch with a new switch.
We still use knives, screwdrivers, ovens and fabrics without losing our mind over how hard is to train new people to work with these ancient, outdated technologies. Yet software, even when it has very real implications for hardware, is somehow different.
In software, adding complexity without any need to do so is ”future-proofing”. And how maintainable something is, is often measured by how well it conforms to a general design that could potentially be extended to do any computable task, ever. You know, in case we in the future need to integrate the light switch into a group chat. If we don't update the system to use a microcontroller now, we are absolutely screwed when we need to connect it to the family chat.
If we instead think of simple tools as ever relevant to simple tasks, it makes perfect sense to train people to work with COBOL and Perl, like we train people to work with knives and switches.
oh yeah they are great at their jobs - batch processing
I currently work for a bank and like to joke about how old some of the systems are... Then I see this and, huh, turns out maybe we are actually up to date by banking standards.
Former COBOL programmer here. After having watched the video, I feel like digging into my 26yo COBOL code again. Curious whether I can find my way around the code of my younger self.
ISPF is the Best Editor Ever!
ISPF is a great editor, although some similar mainframe vendor editing products have additional features not in ISPF. Any vendor that makes a product similar to IBM's must be better or have additional features to survive. Many vendors make their living creating better products than the ones IBM includes for free.
First job, 1996, COBOL85 on Tandem mainframes in the Bank of England.
Worked on Real Time Gross Settlement, Central Gilts Office 2, a little bit of CHAPS, and later on CREST.
If our team had left/been struck down by illness the entire UK financial system (RTGS, CHAPS), bond payments (CGO2) and stock market (CREST) would have ground to a halt.
And most people were legless drunk at least 50% of afternoons
Did write an HTTP server in C and ... TACL ... on it, and zip (de)compression to give it a directory structure deeper than "drive/directory" to use for it.
It is still the best hardware and operating system - dude, Pathway rocks, still does; nothing yet comes close to its resilience and massive parallelism.
Hardly any RAM though, every byte counted and every variable had to be justified
Oh, those long afternoon 'staff meetings' in smoke filled conference rooms where we slowly sobered up before quitting time :D
Three weeks late to the party, but the article was absolutely awesome and the mum is a Giga-Chad Power 10000! F this "Substack" platform for not letting me tell her how awesome she is! Great respect!
Probably the only way to "rewrite" the system is to implement a new bank system from ground up with modern tech and treat it as a new bank and slowly move accounts to the new system one by one. Though it would probably take 3-5 years before the new system would be usable.
Correct. I've done this a few times. Slowly move over and have lots and lots of tests
I have a friend who is also working on COBOL code bases that are decades old and were never planned for refactoring.
This situation of the talent pool going away while the need is still ongoing is also the basis of the movie Space Cowboys.
There the issue is that NASA has to recapture a satellite for which all of the engineering talent has retired, so they bring back Tommy Lee Jones and Clint Eastwood.
5:40 - We literally had a conversation like this just today. When deploying a hotfix, I asked why the API portion of a particular repo under our care had both PHP and Node in it. One of the senior engineers literally said "a guy that used to work here just wanted to try something in Node, so now we're stuck with that for ".
A "module" is a program, made by compiling one or more source files to create the application, same as now. Each module (program) can handle transaction data, various databases, and create print outs. Or a module can be created to run under the CICS transaction system, which means people type in various commands at a terminal and press function keys. Modules can "talk" to each other by one writing to a file, and a later one reading that file, similar to Unix pipes, but not simultaneous.
The love, humour and respect you have is amazing. You’re a gem prime.
My last COBOL job experience was 1999. Changing two character 99 to four character 1999. 99 is a machine level check value in COBOL. My cost center, me, made several million doing conversion work prior to Jan 1, 2000.
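Much of that conversion work used "windowing" rather than physically widening every field: a two-digit year is interpreted relative to a pivot. A minimal sketch, with the pivot value an assumption for illustration:

```python
def expand_year(yy: int, pivot: int = 50) -> int:
    """Expand a two-digit year using a fixed windowing pivot.

    Years below the pivot are read as 20xx, the rest as 19xx.
    The pivot of 50 is an invented example; real shops picked
    values suited to their own data.
    """
    return 2000 + yy if yy < pivot else 1900 + yy

print(expand_year(99))  # 1999
print(expand_year(7))   # 2007
```

Windowing was cheaper than full field expansion, but it only postpones the problem: a fixed pivot of 50 breaks again for dates from 2050 on.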
10:40 COBOL developers are never fired, they die of old age.
It's pretty scary to think about the possibility for a lot of the systems we rely on and take for granted collapsing because of their complexity and the lack of people technically capable of maintaining them.
I programmed old mainframe COBOL for over 30 years. Nobody should try to convert these old systems. Businesses need to document what these old systems do, to create a requirements document for a clean new system. It's possible to run into code that depends on tape processing, JCL executions and other weird things that are not in the code, but that the code depends on to process correctly. I've been retired for years; no one could pay me to analyze old COBOL again.
Old mainframe programmer here, one of my jobs used to be to analyze code that vendors wrote for us, especially if we had issues with it during or after a project. I would analyze their code and then document what it actually did for the business side. They were often surprised because it did not do what the original specifications called for it to do. Sometimes I would modify it to specifications other times someone else in house would change it.
Every language is a tool. Some tools are arguably better than others, but many tools are just designed for different tasks, so it wouldn't be fair to compare a wrench to a screwdriver or complain that a screwdriver is inefficient at driving a nail.
Oh my god finally a video for meee
I program in PL/1 (also mainframe), which stands for "Programming Language 1", but it is not really. :D Concerning the age, I think he writes it's a goldmine because most young people react with "never ever", but the old programmers are retiring, so fresh blood was kinda needed. BTW, a module is probably somewhat comparable to a Java class, but probably a bit bigger, as the legacy stuff is written procedurally, not object-oriented. (I am 34 and the second youngest in my team; Germany.) BTW, they do not literally write the data on a tape; the formats just persisted, so people still have the architecture and the "naming" (i.e. track, block, cylinder, etc.).
P.S.: To the young lad who claims you can rewrite this code: you first have to see the codebase before you speak. I would take any bet that you would not be able to. Also, I think there is no need, at least where I'm working. Old does not equal bad. I would claim it is quite well written even, especially when I compare it to a lot of modern stuff, although most people I know would disagree, because that's what a lot of people wanna do. If code is efficient, works properly, and is easy to read, it is good code; fight me on that if you feel the need :).
This video made me realize money is a much faker concept than I realized.
Money is what some random overworked COBOL programmer in their cubicle decided.
To be fair, mainframes are quite advanced today; DB2 is a good DB and there are now other languages than COBOL.
But the legacy problem is incredible. I'm working in a bank, I was on a project in which we used tools like text analysis and network analysis in order to try to understand how data moves among tables, because nobody knows and there was no documentation... Basically it's archeology...