Describe the computer programmer of the future

One of the side effects of 4GLs and other developments that make programming easier is that you will see a greater gap between the lay programmer and the technical specialist. Standard business applications will be easier to develop and maintain, but they will be more cookie-cutter and will not require much thought, so technical ability and analytical skill will be less in demand. The likely consequences: pay will decline, the talent pool will decline, applicants will increase, and it will be easier to have the work done in other countries.

On the other hand, as the number of good-paying jobs goes down and educational institutions crank out fewer applicants, those with real skill and knowledge will be in greater demand. Though fewer in number, these technical "gurus" will be highly paid and enjoy considerable job security. Basically, the reduction in required skills will rewind the clock on the industry, and those with real skill will resemble the IT workforce of years past.

This is only my opinion and may never come to pass, but I have already seen deteriorating skills among programmers and cookie-cutter approaches to applications. This is good for corporations, as they can retain a more affordable staff and pay only a small number of specialists. It is a bit frightening for the average programmer, though (especially those who are unemployed and trying to find a job without taking too much of a pay cut). For the most part, we have been able to ride a pretty good wave over the past decade (unfortunately, I came in a bit late and haven't been able to ride it as high as some of my peers).

I do not know what emotional tone your hero will have, but if you want something a bit more cynical/arrogant/desperate, I'm sure you can extrapolate from these posts (everyone has made some good, if gloomy, comments) what the future may hold for a programmer.

Remember, a niche is a great place to be until you look around and find that everyone has found a way to fit in your niche.
 


I'd strongly reconsider using virtual reality of any sort. I don't mean to be snippy, but using VR for the kinds of things you're talking about is something like flying a helicopter to your neighbor's house: maybe you could do it, but you're wasting resources and adding needless complexity. Virtual reality in the near future will only be used in specialized applications, like surgery and eventually entertainment.

Turanil said:
He would be assisted by some lesser AI, which would appear as a sort of "wizard familiar" in the virtual world. His interventions in the virtual world could then appear as if he were a D&D Illusionist, so to speak (able to change things that would affect the virtual reality locally).

Turanil said:
Our world is now a mixture of the real and the virtual, as people are almost always connected to the "infosphere" (the all-encompassing Internet of the future).

I'm not entirely sure I understand what you're going for. On one hand you say the world is "mixed", but on the other, you talk about his powers in the virtual world, implying that there is a separation.

Now, the mixture part makes me think of so-called "augmented reality", which is supposed to be the Next Big Thing (and probably won't be). In augmented reality, people wear glasses or have optical implants that overlay information onto what they're seeing. For example, if you're looking at a restaurant, the computer could show you reviews or maybe even tell you how long the wait is. Or, while you're driving, it could show you where to turn with arrows, display your speed, etc. This sort of thing is also generally implausible. Computers are still pretty bad at recognizing images, for one. However, there's less fiction about it than about virtual reality (at least, I think so). You could say that your programmer cracks into the system and causes graphics to appear over his face so nobody can recognize him. That's also pretty ridiculous, but no more so than virtual reality, IMO.

In reality, the future of computing is unlikely to be nearly as different or as flashy as most people seem to think. And, as others have mentioned, the future likely brings more advanced programming tools and less advanced programmers (save a small subset).

However, I have a problem: I know nothing about computers beyond using a few programs and posting on Internet forums. I know nothing about computer programming. So I would be glad to get your help here, as several of you are computer professionals: how do I portray a convincing programmer of the future in a novel (preferably without technical descriptions)? Any suggestions?

Don't describe how anything works. If you have to explain some technology, do so at as high (abstract) a level as possible. Use jargon sparingly. Throwing it around might fool some people, but I can't tell you how many novels I've read where the author made it PAINFULLY obvious he had no idea what he was talking about. Especially don't have the programmer use jargon metaphorically to talk about real life unless you've run it by a programmer to make sure it's reasonable. A real programmer is much more likely to use real life metaphorically to talk about programming.
 

There are useful suggestions here, particularly about a trend I seem to discern in other professions as well. With the advancement of technology and computers, there is less well-paid skilled work for average people; only a minority of talented people will have access to well-paid jobs, while the large majority will be poor.

This is a sci-fi novel with no indication of when it happens (21st or 23rd century). I want virtual and enhanced reality to be pervasive. In this world almost everybody has a flood of nanobots in the brain (or a microchip) that connects them directly to virtual reality, so people are always somewhere between the real world and total hallucination. All my story builds on that, in fact. I just have to make the reader accept it as much as he would accept magic in a fantasy novel.
 

WayneLigon said:
Maybe not worry too much about not knowing much about computers: William Gibson, the 'father' of cyberpunk, didn't know diddly about computers. He got his ideas sitting in open-air cafes and listening to the drug runners talk; the cadence and rhythm of speech, the personalities... these are the important things. Then he wrapped all that in a hypothesized future construct.

*nod* Any sufficiently advanced technology is indistinguishable from magic. And unless you really know what you're about, trying to tell the audience how the magic works will come out as technobabble that'll bore them. Or, worse yet, cheese off those members of the audience who feel they know what computers are "really" like...
 

The biggest thing I would want if I had VR now, as a programmer, would be a VR software engineering environment instead of an on-screen one, which I don't see as a big change: a change of representation rather than a change of actual coding. Basically, I could "physically" touch, move, and manipulate my objects rather than depending as much on keyboard/mouse intermediaries. I'd like to speak rather than type. In short, I'd like to make coding a little more "hands-on", like carpentry. But that's just me; linguistics and visual/spatial are my strong suits. Another person who likes to interact with the world in a different way would likely say something else. And maybe that's the key: perhaps a more immersive VR world would let each of us interact with it in the way we prefer. We would still have the same thoughts and concepts; we would just interface with the world differently. The more the world changes, the more it stays the same.
 

Programming software is always a deal between at least two (but often more) parties. You always have a programmer and an expert in the subject the program will help with. Sometimes the programmer is the expert, but this is rare in business environments. Between programmer and expert there may be an analyst, who interviews the expert and tries to grasp what might be called the "business logic": all the knowledge of the subject needed to build the program. The analyst then lays down this knowledge in a "model" that represents the problem in a "language" (diagrams, explanations, etc.) that can be understood either by a "software designer" or by the programmer himself (if he's doing the designing). The analyst and designer might be the same person as well, though analysis and design are tasks best done separately.
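For instance, a line in an analyst's model might read "any order over $500 requires manager approval", and the designer/programmer's job is to turn that sentence into code. A minimal sketch in Python, with the rule and the dollar figure invented purely for illustration:

# Hypothetical business rule, as an analyst might state it:
#   "Any order over $500 requires manager approval."
APPROVAL_THRESHOLD = 500.00  # invented figure, for illustration only

def needs_manager_approval(order_total):
    """The analyst's sentence, translated into executable form."""
    return order_total > APPROVAL_THRESHOLD

print(needs_manager_approval(750.00))  # True
print(needs_manager_approval(120.00))  # False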

In what's called "machine language programming", the programmer knows the ins and outs of the specific computer hardware he's writing for and gives more or less direct instructions to that hardware. At this level programming is hard and requires thinking in terms of computer concepts, not business concepts. The programmer is king, because the analyst or client (and probably the designer as well) won't have a clue what the program does or how. At this level, even other programmers usually have to strain to understand what a program does.
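Python is nowhere near machine language, but its standard dis module can display the instruction-by-instruction form its virtual machine actually runs, which gives a taste of what thinking at this level is like (exact opcode names vary by Python version):

import dis

def add(a, b):
    return a + b

# Print the low-level instructions behind the one-line function above.
# Real machine code is lower-level still, but the flavor is similar:
# individual loads, a single add, a return.
dis.dis(add)
# Output, roughly:  LOAD_FAST a / LOAD_FAST b / BINARY_ADD / RETURN_VALUE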

Then there's "low-level programming", which shields the programmer from the hardware but still makes him work in pure programming concepts. Programs are more "readable", but still beyond the grasp of an analyst (or perhaps a designer). It's easier to program at this level, so bigger and more complex programs become possible.

In a "High level programming language" the line between programmer and designer breaks, since programming is done in "High level concepts" (objects, bussiness rules, logical prepositions, etc.) High level programs should be more readable and very easy to understand by peers. Programming becomes very fast if the right language is chosen for the task (some high level languages are general, but some are very specific). These languages are more complex internally (since there's an intermediate level at which high level concepts are converted to machine instructions), and require extra computing power... so though they have existed for years, they weren't always used.

The future should see a trend toward even higher-level languages, at which point a designer/programmer is no longer needed and an analyst can work by himself. "Expressing the solution to a problem" should become synonymous with "programming"; i.e., if you can precisely explain what you need, you can have it. These languages will rely on internal AI to make assumptions and choose solutions, and will have huge libraries of small reusable programs (the LEGO-block approach mentioned earlier).
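A purely speculative sketch of that LEGO-block idea in today's Python, with every name invented for illustration: small reusable parts that a future tool (or its internal AI) might snap together from a request like "total the unpaid invoices per customer":

from collections import defaultdict

def unpaid(invoices):
    """Reusable block: keep only invoices that haven't been paid."""
    return [inv for inv in invoices if not inv["paid"]]

def total_by(invoices, key):
    """Reusable block: sum invoice amounts, grouped by some field."""
    totals = defaultdict(float)
    for inv in invoices:
        totals[inv[key]] += inv["amount"]
    return dict(totals)

# The "program" is just the blocks snapped together:
invoices = [
    {"customer": "Acme",   "amount": 100.0, "paid": False},
    {"customer": "Acme",   "amount": 50.0,  "paid": True},
    {"customer": "Globex", "amount": 75.0,  "paid": False},
]
print(total_by(unpaid(invoices), "customer"))  # {'Acme': 100.0, 'Globex': 75.0}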
 

Take a look at Croquet and Alice. They should give you an idea of how non-programmers (at least in the traditional sense) will be able to manipulate a 3D world some years from now.
 

Pets & Sidekicks

Remove ads

Top