Net Results: I've always thought of "computer scientist" as simply another name for "computer programmer".
But according to Barry Boehm, professor of software engineering at the University of Southern California, programming is fast becoming one of the least important skills of the computer scientist.
This curious state of affairs is among the things he mulls over as he organises a large software engineering Master's degree programme at USC, preparing 500 students for a future of working with software.
"What we like to do is anticipate the kind of thing they're going to have to deal with in the future," he told an audience last Friday at a Dublin workshop and joint meeting of the International Process Research Consortium and the new Irish Software Engineering Research Consortium (ISERC). "We want to teach them the processes of the future, and not the processes of the past."
Prof Boehm, whose career has spanned work with US think tank RAND, the Defense Advanced Research Projects Agency (Darpa, out of which the internet emerged) and academia, is the author of several iconic computing textbooks and software development theories and is recognised as a key influencer in the discipline of computer science.
One of his early perceptions - that software would quickly become the central cost in computing systems, not the hardware - caused Darpa to rethink the direction of computer research and investment and helped reshape the emerging computing industry.
So he's a man whose views on software and computing education are particularly worth listening to, and his relaxed, lively speaking style makes him a pleasure to hear.
In a discussion of the seven "surprise-free" trends in computing that will shape computing's future - and two extra "wild card" trends whose impact may or may not be significant - Boehm firmly demoted programming from being the main skill of the software specialist.
"One thing we're finding is it's less and less important to teach people programming anymore," he says. "Now, people need to focus on understanding systems engineering [putting together large computing systems] and knowing how to evaluate off-the-shelf programs."
It makes sense - as the industry and the discipline have grown more complex, there's less need for people to know how to build a piece of software themselves. There will obviously always be jobs for dedicated programmers creating the software that is bought off the shelf and fitted together with other systems in an organisation. But there's less demand for lots of in-house programmers writing bits of programs or bespoke software - and we all saw what headaches some of those old-time approaches created in the run-up to Y2K, when millions of lines of code had to be painstakingly checked, often by old-time programmers called out of retirement.
I was intrigued by some of Prof Boehm's other points too - far from being "surprise-free" for me, they raised some fascinating software issues that I'd never thought about before.
Among them is the cultural aspect of software. "What works for one culture, really turns another culture off," says Prof Boehm.
He talked about some of the social science concepts used to classify traits within a culture - some cultures are group-focused, some individual-focused; some value the achievement of short-term goals, while others are flexible about deadlines and prefer to get the longer-term picture right.
Then he set this in the context of creating software or implementing large-scale software projects, taking as his example a Thai graduate student at USC who learned American methods and tried to bring them back to Thailand. He quickly found that companies there couldn't implement the processes he'd learned - the method was individual-focused and valued meeting short-term goals, whereas Thais tend to be more group-oriented and weren't keen on forcing quick results, he says.
Another important trend is towards continuous learning - which seems fairly obvious, but Prof Boehm has an interesting take here, too.
"What we're focusing on is for students not to learn about stuff, but to learn about how to learn about stuff."
Computer scientists will increasingly have to take in information from a range of sources to solve problems, so Prof Boehm likes to give students problems where they have to work out an answer by doing research, not writing code.
Of course the really interesting bit of the talk was about the wild cards, the trends whose impacts are really hard to predict. Prof Boehm thinks there are two: autonomic computing - the ability for software to think for itself, partly because humans are too slow to react to changes in big, automated systems; and what he calls "bio-computing", the merging of biological processes and computing.
"There's a great potential for good, but also a great potential for harm - new failure modes for computing, the loss of human primacy, even the overpowerment of humans."
And just when it was getting good - that was it. But a good speaker leaves you wanting more - and thinking well into the future about what was said.
weblog: http://weblog.techno-culture.com