Can COBOL ever be eradicated?


Ivanhoe


Forcing you to write constructors and destructors whether or not you use them for initialization or cleanup. Waste motion.

 

WRT try-catch logic, I like being forced to explicitly handle exceptions. I know Saint Bjarne would call me mentally lazy for it, but programming is tough enough without having all the "freedom" that he included in C++.

 

You wouldn't need to write any constructors / destructors unless you were designing said RAII guard classes yourself, which you most likely wouldn't be, since they already exist and come as part of the respective library (if it honors the RAII idiom). Here, let me give you an example showing how handy RAII can be: two functions that do essentially the same thing, one without and one with RAII.

 

// mutex to synchronize access to data shared across multiple threads
boost::mutex myMutex;

// the shared data
int mySharedInt = 12345;

bool someHorriblyComplicatedFunction_notUsing_RAII()
{
	// Before doing anything with our shared data, we need to acquire the mutex.
	// Beware that we __MUST__ release the mutex before we leave the function.
	myMutex.lock();

	if ( mySharedInt < 6 )
	{
		if ( mySharedInt == 3 )
		{
			mySharedInt = 1;
			myMutex.unlock();
			return true;
		}
		else
		{
			if ( mySharedInt % 2 == 0 )
			{
				mySharedInt = 3;
				myMutex.unlock();
				return false;
			}
		}
	}
	else if ( mySharedInt >= 55 )
	{
		if ( 55 )
		{
			mySharedInt = 12;
			myMutex.unlock();
			return false;
		}
	}

	// some operation that might throw
	try
	{
		std::vector< bool > b;
		b.reserve( 5000000 );

		// ...
	}
	catch ( const std::bad_alloc& )
	{
		myMutex.unlock();
		throw; // must leave the function here, or the mutex would be
		       // released a second time below
	}

	mySharedInt = 677;

	myMutex.unlock();

	return true;
}

bool someHorriblyComplicatedFunction_thisTime_WITH_RAII()
{
	// create RAII guard ... handles acquiring / releasing the mutex automatically
	const boost::mutex::scoped_lock sl( myMutex );

	if ( mySharedInt < 6 )
	{
		if ( mySharedInt == 3 )
		{
			mySharedInt = 1;
			return true;
		}
		else
		{
			if ( mySharedInt % 2 == 0 )
			{
				mySharedInt = 3;
				return false;
			}
		}
	}
	else if ( mySharedInt >= 55 )
	{
		if ( 55 )
		{
			mySharedInt = 12;
			return false;
		}
	}

	// some operation that might throw ... but this time we don't really care if it
	// throws, since the mutex will be released automatically if it does
	std::vector< bool > b;
	b.reserve( 5000000 );

	mySharedInt = 677;

	return true;
}

 

Now you tell me which variant is easier to read / maintain? Which offers more opportunities to shoot yourself in the foot by forgetting to release the mutex? Which has more lines of code? Still think RAII is a 'waste motion'? :)

 

 

Why do I need the "right" libraries when the C# standard System assemblies give me everything I need and more, pre-compiled, out of the box?

 

Oh, I'm not arguing against using C# ... C# is a fine language (it actually supports RAII, too!), it's the notion that Java is somehow a superior language that I don't agree with. :)

 

 

EDIT: The above is pseudo-code ... I don't expect that to compile without errors.

Edited by Red Ant

Guest aevans

Doesn't

if ( 55 )

always evaluate to "true" in C++?

 

In any case, like I said, I don't find having to explicitly release resources to be a bad thing.

 

And when I was using C++, you did have to include a destructor if you wanted to automatically release any resources allocated with the

new

keyword. (I forgot that C++ compilers can give you a default constructor.) I didn't even mention the nasty bits about having to invoke

delete

every time you're through with an object instance. I very much like garbage collection. It eliminates all of that.


Doesn't

if ( 55 )

always evaluate to "true" in C++?

 

Yes, that was an oversight on my part. Should have been "if ( mySharedInt == 55 )" ... but as I said, the example I gave is merely pseudo-code ... I did not check for syntax errors.

 

In any case, like I said, I don't find having to explicitly release resources to be a bad thing.

 

It's generally more prone to errors, though. The more work you can offload from the programmer to the compiler, the less potential for errors there is.

 

And when I was using C++, you did have to include a destructor if you wanted to automatically release any resources allocated with the

new

keyword. (I forgot that C++ compilers can give you a default constructor.) I didn't even mention the nasty bits about having to invoke

delete

every time you're through with an object instance. I very much like garbage collection. It eliminates all of that.

 

Implementing custom destructors is always optional. It depends solely on whether or not your objects own any resources that need to be cleaned up when said objects are destroyed. If they don't then the default destructors generated by the compiler are entirely sufficient.

 

I didn't even mention the nasty bits about having to invoke

delete

every time you're through with an object instance. I very much like garbage collection. It eliminates all of that.

 

And once again, RAII rears its ugly err I mean pretty ;) head.

 

void blah()
{
   boost::shared_ptr< MyClass > pointer( new MyClass );

   // no explicit delete required ... our smart pointer will take care of that automatically on going out of scope
}

 

Using smart pointers it's absolutely trivial to write C++ programs without a single explicit mention of operator delete.


By the way, I'm not arguing against the usefulness of garbage collection either. If C++ were to gain that feature, I'd embrace it readily. I absolutely do need my RAII, though, and I refuse to even consider using a language that doesn't have it.


Guest aevans

It's generally more prone to errors, though. The more work you can offload from the programmer to the compiler, the less potential for errors there is.

 

See, I look at it from an entirely different perspective. I don't think I should have to invoke delete for every application resource I allocate with new, but I do like being forced to pay close attention to OS and network resources like file handles and database connections. It reflects a sense of priorities, IMO.

 

Implementing custom destructors is always optional. It depends solely on whether or not your objects own any resources that need to be cleaned up when said objects are destroyed. If they don't then the default destructors generated by the compiler are entirely sufficient.

 

But what useful object doesn't instantiate and use stuff that in C++ would have to be deleted in an explicitly defined destructor?

 

And once again, RAII rears its ugly err I mean pretty head.

 

Using smart pointers it's absolutely trivial to write C++ programs without a single explicit mention of operator delete.

 

But here you're relying on a library that gives you an object encapsulating a pointer plus (presumably fail-safe) auto-destruct code. What you're doing is promoting pointers to first-class members. That's kind of ass backwards, if you don't mind my saying so. If you want to hide pointers and all of their pitfalls under the hood, just make it a natural function of the language. Oh wait -- that would break one of Saint Bjarne's arbitrary rules. You know -- the one about never curtailing programming options simply for something inane like, oh, convenience and productivity.


See, I look at it from an entirely different perspective. I don't think I should have to invoke delete for every application resource I allocate with new, but I do like being forced to pay close attention to OS and network resources like file handles and database connections. It reflects a sense of priorities, IMO.

 

If more programmers made proper use of RAII, the world wouldn't be awash in programs full of resource leaks. Humans tend to forget stuff. Compilers normally don't. If you're insinuating that RAII encourages sloppy programming then I guess I don't know how to respond to that. :blink:

On the one hand you're telling me that C++ is too low-level for you, and then when you're shown a high-level feature that Java doesn't have but C++ does, you're telling me it's a bit too high-level for you?

 

 

But what useful object doesn't instantiate and use stuff that in C++ would have to be deleted in an explicitly defined destructor?

 

Didn't I just very clearly show you how to avoid having to clean up explicitly? Hint: instead of embedding a raw pointer to a new-allocated resource in your class, just use a smart pointer instead. Now you won't have to worry about manually deleting stuff. As an added bonus, the default copy constructor and copy assignment operator generated by the compiler now also have the correct semantics, which they wouldn't if you had used a raw pointer.

 

 

But here you're relying on a library that gives you an object encapsulating a pointer plus (presumably fail-safe) auto-destruct code. What you're doing is promoting pointers to first-class members. That's kind of ass backwards, if you don't mind my saying so. If you want to hide pointers and all of their pitfalls under the hood, just make it a natural function of the language. Oh wait -- that would break one of Saint Bjarne's arbitrary rules. You know -- the one about never curtailing programming options simply for something inane like, oh, convenience and productivity.

 

I don't understand your objection. So what if I'm promoting pointers to first class members? Why is that "ass backwards" if it

  • works and
  • is a lot simpler and safer than messing around with raw pointers, delete and custom copy constructors / destructors?

 

The way I see it, refusing to let the compiler help you out by doing something implicitly that you'd otherwise have to do explicitly (and very likely end up forgetting it in a few cases) is just foolish. The compiler is a heck of a lot less likely to make mistakes than you are.

Edited by Red Ant

Guest aevans

The way I see it, refusing to let the compiler help you out by doing something implicitly that you'd otherwise have to do explicitly (and very likely end up forgetting it in a few cases) is just foolish. The compiler is a heck of a lot less likely to make mistakes than you are.

 

But that's my whole point: You're insisting that it's the bee's knees to have to use a library to invoke that kind of compiler behavior; I get it automatically, by design, in the language I work with every day.

 

IOW, I don't have a problem with pointers being first class members. That's how C#, Java, and a lot of other languages work -- everything's an object, accessed by name. But bolting that kind of behavior onto a lower level language, while it may work, is just a bit retrograde, IMO. Either you use the lower level language because you need to work that close to the machine, or you use something that is more convenient and makes you more productive. YMMV.

Edited by aevans

OMG. COBOL, Fortran -- took them both in college, then never, ever used them, so I could not write a line of either any more (not that I could do it very well back then either). Dabbled in Ada while in Germany, as well as BASIC on the IBM PCjr. So what is the future of computing? Java?

 

Pascal! The Language of The Gods! :wub:^_^


Guest aevans

Fucking programmers, go out and do some real engineering for once

 

If you recall, I've long maintained that programming is a craft, not an engineering discipline. But you know, where exactly would engineers be if they didn't have craftsmen to implement their big plans?


But that's my whole point: You're insisting that it's the bee's knees to have to use a library to invoke that kind of compiler behavior; I get it automatically, by design, in the language I work with every day.

 

IOW, I don't have a problem with pointers being first class members. That's how C#, Java, and a lot of other languages work -- everything's an object, accessed by name. But bolting that kind of behavior onto a lower level language, while it may work, is just a bit retrograde, IMO. Either you use the lower level language because you need to work that close to the machine, or you use something that is more convenient and makes you more productive. YMMV.

 

Look, what it boils down to is that C++ supports RAII (a highly useful technique if exception-safe programming means anything to you) and Java doesn't. I'm not too well-versed in C# but I hear C# does support RAII, so your point is valid for C++ vs C# but not for C++ vs Java.


COBOL programmer, that was me. Most of the code I wrote over 25 years was in COBOL. We were trying to determine how we were going to rewrite everything for Windows when I decided to get out. The thrill was gone. I wasn't going to write the same stuff again in a different language.

 

That is the problem with COBOL. Who the heck wants to spend their career translating that to some other language? Plus most new programmers probably spent their time learning graphics. Graphics is nice, but it doesn't process vital information.


Pascal! The Language of The Gods! :wub:^_^

 

ALGOL, the language of the One True God! ;)

 

A long, long time ago I crossed paths with one of those crusty old farts who maintained that ALGOL was the One True Language (for scientific computing, anyway) and that the world was in a state of sin for having moved on to the languages of false gods, etc.

 

I dunno, I thought the state of sin was pretty enjoyable there for awhile (especially when Fortran 90 extensions became pretty universal in COTS compilers, preceding the acceptance of the actual standard as usual).


Guest aevans

Look, what it boils down to is that C++ supports RAII (a highly useful technique if exception-safe programming means anything to you) and Java doesn't. I'm not too well-versed in C# but I hear C# does support RAII, so your point is valid for C++ vs C# but not for C++ vs Java.

 

Look, you're making a religious statement about a design decision. The reason Java doesn't support RAII is because Java is, by design, totally dependent on garbage collection. You may think that's a poor design decision, but that's all it is -- a design decision.

 

Your long excursion around the barn about how useful RAII is in C++ only applies to C++. In C#, you can let the garbage collector do its job, or explicitly call the Finalize() method of an object to release resources. In other languages, other patterns apply. Whether RAII is an explicitly available pattern is a design decision, and it's really not that big a deal.


Guest aevans
Plus most new programmers probably spent their time learning graphics. Graphics is nice, but it doesn't process vital information.

 

BTDT. When I went back to school in my late thirties, it seemed like I was always swimming upstream because I wanted to work on business-related problems and not play around with pretty pictures. When senior project time came around, all of the kiddies wanted to build games, while I satisfied myself with a simple little MSDS management system. Except for one girl who was a programming goddess, all of the people working on games (some of them in teams) were about halfway done at the end of the semester. I completed my project with a week or two to spare and achieved all of my design objectives.

 

I knew I had done the right thing when I interviewed for a job -- I knew all about databases and business system design, while the kids who graduated with me went into interviews trying to sell graphics design skills to managers who needed somebody to build data-based web sites.

Edited by aevans

Look, you're making a religious statement about a design decision. The reason Java doesn't support RAII is because Java is, by design, totally dependent on garbage collection. You may think that's a poor design decision, but that's all it is -- a design decision.

 

WTF has garbage collection got to do with anything? I feel like I'm talking to a wall. Garbage collection is of no use when you need a mutex unlocked, a file or network connection handle closed or something else done on leaving a certain scope. Having garbage collection is totally not an excuse for not implementing support for RAII. Python, for example, has garbage collection but still supports RAII.

 

Your long excursion around the barn about how useful RAII is in C++ only applies to C++. In C#, you can let the garbage collector do its job, or explicitly call the Finalize() method of an object to release resources. In other languages, other patterns apply. Whether RAII is an explicitly available pattern is a design decision, and it's really not that big a deal.

 

So what you're really saying is that in some languages you'll just have to make do without that feature? Duh! :rolleyes:


Guest aevans

WTF has garbage collection got to do with anything? I feel like I'm talking to a wall. Garbage collection is of no use when you need a mutex unlocked, a file or network connection handle closed or something else done on leaving a certain scope. Having garbage collection is totally not an excuse for not implementing support for RAII. Python, for example, has garbage collection but still supports RAII.

 

Like I said, RAII is a religious issue for you. You don't have to rely on it to do any of the things you've described, since they can all be done explicitly. Even in Java you can write code to guarantee network and OS resource release, though you can't get the reference variables de-allocated right away. (Gotta wait for the garbage collector to do that.) There's no reason to insist on RAII except as a matter of religious conviction. Don't mean to hurt your feelings, those are the facts.

 

So what you're really saying is that in some languages you'll just have to make do without that feature? Duh!

 

I'm saying that regardless of the language, there are ways to implement desired or required design patterns. Only a religious fanatic insists that a language do it his way.


Speaking of getting out of COBOL, I thought of another reason. Back in the Y2K era we had to fix the dates in all of our databases. We had two-digit years on our dates.

YYMMDD. But we did not want to completely change 2 million plus database records in multiple cities where everything was sorted by date. You would have had to unload and completely reload the database if you changed the date to YYYYMMDD. So we made little code changes where the years after 99 were translated to A0, the next year to A1 and so on. That might last until ... now: AA. I wonder if multiple databases will run into a wall on Jan 1, 2011?

Edited by Mobius

Like I said, RAII is a religious issue for you. You don't have to rely on it to do any of the things you've described, since they can all be done explicitly. Even in Java you can write code to guarantee network and OS resource release, though you can't get the reference variables de-allocated right away. (Gotta wait for the garbage collector to do that.) There's no reason to insist on RAII except as a matter of religious conviction. Don't mean to hurt your feelings, those are the facts.

 

The only way of doing that in Java is using those ugly, extremely elaborate try-catch-finally constructs (and you'll have to rethrow too, if you want to be exception-neutral) and doing the cleanup explicitly in the finally-block. Garbage collection doesn't help at all, since you aren't even guaranteed that your object's finalize() method gets executed at all.

 

I'm saying that regardless of the language, there are ways to implement desired or required design patterns. Only a religious fanatic insists that a language do it his way.

 

Yes, there are, and they are complicated and ugly as opposed to the neat simplicity of RAII. You must be the first programmer I've met who knows what RAII is but doesn't think not having it is a big deal. Your stance on this topic is kind of like saying "Oh we don't need no stinking functions and classes because we can just code in ASM." Sure, any problem that can be solved WITH a feature that makes life easier can also be solved without it - it'll just be a lot less comfortable.

Edited by Red Ant

Guest aevans

The only way of doing that in Java is using those ugly, extremely elaborate try-catch-finally constructs (and you'll have to rethrow too, if you want to be exception-neutral) and doing the cleanup explicitly in the finally-block. Garbage collection doesn't help at all, since you aren't even guaranteed that your object's finalize() method gets executed at all.

 

You're missing the point. It can be done. Whether it's the way any particular person wants it to be done is totally irrelevant.

 

BTW, the reference to the garbage collector was in the interest of technical completeness. When an OS resource like a file handle, or a network resource like a database connection, is released, in garbage collected languages there's still a variable referencing the resource waiting for garbage collection. The file is closed, or the connection is returned to the pool, but the pointer to it has to go out of scope and be destroyed in the normal way.

 

Yes, there are, and they are complicated and ugly as opposed to the neat simplicity of RAII. You must be the first programmer I've met who knows what RAII is but doesn't think not having it is a big deal. Your stance on this topic is kind of like saying "Oh we don't need no stinking functions and classes because we can just code in ASM." Sure, any problem that can be solved WITH a feature that makes life easier can also be solved without it - it'll just be a lot less comfortable.

 

Do you write software for a living? Because people that do it for a living generally don't engage in religious evangelism like you are engaged in. They know how to program, period, and just make do with whatever language the boss or the project requires.

Edited by aevans

Do you write software for a living? Because people that do it for a living generally don't engage in religious evangelism like you are engaged in. They know how to program, period, and just make do with whatever language the boss or the project requires.

That's for sure. It's not usually the jazziness of the language but other factors. Can it be maintained in the future? Can they find other programmers if the current ones leave? Will the software company be around to support the language in the years ahead? One reason our company went with MS C# instead of Delphi for some Windows interface products is that management thought Borland might die. Plus, MS knows shortcuts in their Windows API, while 3rd-party software often has to go through published routines.

Edited by Mobius

You're missing the point. It can be done. Whether it's the way any particular person wants it to be done is totally irrelevant.

 

You're making it seem as if __I__'m the one who started this endless debate when it was in fact you that took me up on a point I made about Java, the supposed C++ killer. I simply stated that C++ does RAII while Java doesn't, to which you responded with some irrelevant talk about garbage collection and the following statement:

 

But I can appreciate the convenience and power of Java-like languages, like the C# I write in every day.

 

which is ironic, given that RAII is both a convenient and powerful tool ... that Java doesn't have. Anyway, to get back to the point I'm not saying that C++ is the one true language for every task or even that you shouldn't use Java; all I'm saying is that for a supposed C++ killer language to lack such a fundamentally useful feature is certainly a disappointment.

 

BTW, the reference to the garbage collector was in the interest of technical completeness. When an OS resource like a file handle, or a network resource like a database connection, is released, in garbage collected languages there's still a variable referencing the resource waiting for garbage collection. The file is closed, or the connection is returned to the pool, but the pointer to it has to go out of scope and be destroyed in the normal way.

 

Not in Java, nope. The memory of the class object representing a database connection will be properly freed by the garbage collector, but the actual database connection will not necessarily get closed because that would require your finalize() method to run, and the garbage collector gives you ZIP guarantees as to WHEN or even IF that will happen. Linky -> Practical Java Praxis 67: Do Not Rely on finalize Methods for Nonmemory Resource Cleanup

 

 

Do you write software for a living? Because people that do it for a living generally don't engage in religious evangelism like you are engaged in. They know how to program, period, and just make do with whatever language the boss or the project requires.

 

Actually yes, I do. And the software we write is safety-critical, which is why mechanisms that eliminate or reduce potential sources of errors are extremely important to us. Apart from the fact that eliminating error sources should be a prime goal of ANY programmer worth his salt.


Guest aevans

You're making it seem as if __I__'m the one who started this endless debate when it was in fact you that took me up on a point I made about Java, the supposed C++ killer. I simply stated that C++ does RAII while Java doesn't, to which you responded with some irrelevant talk about garbage collection and the following statement:

 

I wasn't talking about RAII. I was talking about the language in general. If I didn't make that clear, I apologize.

 

which is ironic, given that RAII is both a convenient and powerful tool ... that Java doesn't have. Anyway, to get back to the point I'm not saying that C++ is the one true language for every task or even that you shouldn't use Java; all I'm saying is that for a supposed C++ killer language to lack such a fundamentally useful feature is certainly a disappointment.

 

You said Java was a "C++ killer". You may have heard that somewhere, but I never have, and I certainly wouldn't agree with such an obvious absurdity. You, my friend, have created a false dilemma.

 

Having said that, all of the memory managed languages taken together (Java, C#, PHP, Perl, Ruby, etc.) have really cut into the potential C/C++ code inventory, because they do what many programmers need them to do, without requiring programmers to mess with stuff they don't need to mess with.

 

Not in Java, nope. The memory of the class object representing a database connection will be properly freed by the garbage collector, but the actual database connection will not necessarily get closed because that would require your finalize() method to run, and the garbage collector gives you ZIP guarantees as to WHEN or even IF that will happen. Linky -> Practical Java Praxis 67: Do Not Rely on finalize Methods for Nonmemory Resource Cleanup

 

Thanks for linking to an article describing what I was talking about. Now go back and re-read what I wrote.

 

Did you re-read it? Jeepers -- how dumb do you think I am!? Of course you can't rely on the garbage collector to run Finalize(). That's why you use try-catch-finally. And that's precisely what I meant when I wrote: "Even in Java you can write code to guarantee network and OS resource release, though you can't get the reference variables de-allocated right away." IOW, there are well known and easy to implement design patterns for resource control, even if they don't rely on the holy RAII.

 

Actually yes, I do. And the software we write is safety-critical, which is why mechanisms that eliminate or reduce potential sources of errors are extremely important to us. Apart from the fact that eliminating error sources should be a prime goal of ANY programmer worth his salt.

 

Since you want to go there, any programmer "worth his salt" knows that any desired but non-existent language feature can be synthesized. The smart pointer you demonstrated earlier is a pointer synthesized into an object through encapsulation. By the same token, safe resource management can be synthesized by writing the necessary logic into wrapper code encapsulating an unsafe class in a safe one. No big deal -- it's done every day.

 

WRT your professional endeavors, here's a piece of unsolicited but valuable advice: don't fall in love with any particular language, much less any particular language feature. You're going to have to say goodbye to your present job someday, and if you insist on working within a narrow selection of technologies, you're just making yourself unmarketable.

Edited by aevans

Programming languages are like construction materials... I feel like I am watching a group of masons and carpenters argue over which material to build a structure with....

 

The large company I work for still builds NEW COBOL programs for a niche of programming challenges... if Pascal or Fortran were still the best of breed for another niche, we would use them too... If you want to support high speed transaction processing and you have infrastructure in the way of a standing cadre of COBOL programmers, VSAM data and IBM big iron sitting in your data center, COBOL still makes sense.

 

FYI - We are also migrating some systems to RAM resident data fabric, SOA and fancy pseudo 4-5GL UI builders for other solutions....so the proverbial CIO/CTO head is not entirely sand immersed.

 

You use what makes sense....arguing over which is better for everything is something only silly programmers do :P

Edited by medicjim86

Guest aevans
You use what makes sense....arguing over which is better for everything is something only silly programmers do

 

Agree 100%.


Are you talking about the ISPF editor?

 

JCL - Job Control Language - it's the bit that went onto the 80-column punched cards and "told" the mainframe how to actually run the COBOL code (source data device(s), code, output method and device, etc.) while passing it variables like date/time. Very simple, very dense language.

 

I had just come into IT direct with no training. This was my first task on day 1 - run an offsite pension run (simulate a disaster recovery scenario) for a rather large Oz Government Department.

