Describe the computer programmer of the future

Turanil

Well, I have mentioned it before, but now I have begun the real work: writing a sci-fi novel! It takes place in a relatively near future, on Earth. Our world has become a mixture of the real and the virtual, as people are almost always connected to the "infosphere" (the all-encompassing Internet of the future).

My hero is a professional in the computer industry. His work has something to do with software, and I want it to be related to virtual reality. He would be assisted by a lesser AI that would appear as a sort of "wizard familiar" in the virtual world. His interventions in the virtual world could then make him look like a D&D Illusionist, so to speak (able to change things that affect the virtual reality locally). These would be his (limited) special powers.

However, I have a problem: I know nothing about computers beyond using a few programs and posting on Internet forums, and I know nothing about computer programming. So I would be glad to get your help here, as several of you are computer professionals: how do I portray a convincing programmer of the future in a novel (preferably without technical descriptions)? Any suggestions?

Thanks.
 


Well, since programmers write programs to perform functions, this guy would do that. These functions can range from business/accounting reports to database manipulation, scientific calculations, games, education, and so on (as you are aware, the list is huge).

Also, programmers rarely write more than they have to, instead utilizing existing code that performs the necessary function (or comes close) and modifying said code for the specific need at hand.
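
To make that concrete, here's a tiny sketch in Python (the language choice is mine, purely for illustration): rather than writing a sorting routine from scratch, the programmer grabs an existing one and adapts it to the need at hand.

    # Reuse the language's built-in sort instead of writing one from scratch.
    employees = [("Smith", 52000), ("Jones", 48000), ("Doe", 61000)]

    # The only "new" code is the adaptation: sort by salary, highest first.
    by_salary = sorted(employees, key=lambda emp: emp[1], reverse=True)

    print(by_salary)  # [('Doe', 61000), ('Smith', 52000), ('Jones', 48000)]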

Programmers need to analyze, plan, code and test (aside from meetings, documentation, surfing the web...er...shhh) their programs.

Now, take these tasks and move them into today's world of PC programming (I am a mainframe business programmer), with its existing objects and modules (reusable pieces of code that programmers use to build their programs). Look at the plugins and such that you can use in your desktop software or web pages. This programmer would probably manipulate existing pieces of code and add the necessary "glue" to make the program work.

Finally, let's move into a future world of VR and AI. Now, we have a programmer that works in a visual/virtual environment, creating programs with the help of intelligent computers. Debugging may occur at the same time the programmer is building the program ("I'm sorry Mr. Smith, that will invariably result in an infinite loop, please adjust your code"). The virtual environment would allow for the "virtual programmer" to "walk" amongst his code and move things about "physically".

That's my take on how things might be for the sci-fi programmer of the future (sorry about being somewhat vague and not elaborating on some of the ideas - I'm in a hurry)...
 

Thanks for the suggestions. This gives me another question: what terminology sounds appropriate for the programmer doing his job? Any nice words to list?
 


I don't know what would be appropriate (there are books upon books of this type of stuff, as well as websites), but here is some jargon. I don't know what your level of knowledge is, so some of this may be elementary. Also, I am working from memory for the definitions, so I may not be textbook. I'll keep it somewhat simplistic and avoid too much techno-babble, as the jargon of the future may be substantially different even if the meanings remain:

Object - a piece of code designed to do a single function (often combined with other objects to create a program).
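
To picture it, here's a single-purpose piece of code in Python (my example, not from the thread - the definition itself is language-agnostic):

    # An "object" in the sense above: one piece of code, one job.
    def apply_tax(amount, rate=0.07):
        return amount * (1 + rate)

    # A second single-purpose piece, combined with the first to build a program.
    def format_price(amount):
        return f"${amount:,.2f}"

    print(format_price(apply_tax(100.00)))  # $107.00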

Module - an object or group of objects that perform a function (many times a compound function, made up of multiple objects - may be a stand-alone program or another piece of a program).

Syntax - the text and format of a programming language (by this, I mean the correct usage of a programming language's verbs, commands, variables, etc. - this does not guarantee a program will function as planned, only that it will be understood by the computer).

Source Code - the human readable representation of the program (words like if, when, until, +, -, =, etc. are easier to read and understand than 1's and 0's).

Object Code - the machine readable representation of the program (machine language is discussed below, but basically this is what the computer actually executes).
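
A quick way to see the two side by side in Python (Python's bytecode stands in here for true object code - the idea is the same):

    import dis

    def add(a, b):
        return a + b    # the human-readable source code

    # The lower-level instructions the machine actually executes:
    dis.dis(add)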

Recursion - the technique of having an object perform itself (being a mainframe programmer, in a field where recursion is often frowned upon, an example is not presenting itself right now - see the sketch below).
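
Chiming in with the classic example, sketched in Python (any language would do):

    # A recursive function: it performs itself on a smaller input
    # until it reaches a case simple enough to answer directly.
    def factorial(n):
        if n <= 1:                       # base case: stops the recursion
            return 1
        return n * factorial(n - 1)      # the function calls itself

    print(factorial(5))  # 120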

Flowchart - a pictorial representation of a program or system of programs.

Variable - a programmer-defined piece of the program used to store a value (also called Fields, these are used as holders for totals, counters, quantities, names, addresses, etc. - anything that may not be static throughout the execution of a program; for example, as records from an employee file are read in, the name field will change for each employee).

Loop - a programming technique that allows a piece of code to automatically be executed over and over (hopefully until a condition is met that will end the loop - otherwise, the loop is known as an infinite loop and the program will either automatically terminate or need to be manually terminated).
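
A minimal Python sketch of both the well-behaved case and the runaway one (illustrative only):

    # A well-behaved loop: the condition eventually becomes false.
    count = 0
    while count < 3:
        print("iteration", count)
        count += 1    # without this line, the condition never changes...

    # ...and the loop becomes infinite:
    #     while count < 3:
    #         print("stuck forever")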

Iteration - one time through a loop (one of a computer's strengths is the ability to perform monotonous iterative tasks faster than human beings).

Client-Server - the technique of a single computer (the server) executing a program that is then utilized by multiple people at "client" workstations (a mainframe is the perfect example of this - at a bank, the customer account program is executed by the computer in the back room or at another building, while the teller sees the results on his/her screen).

Multitasking - the technique of a computer running multiple programs at the same time (mainframe computers and other multi-processor computers can accomplish this via the use of multiple processing chips to perform the functions - Windows-style PCs use a prioritizing/swapping/timesharing method that closely resembles multitasking - I think that this is still the case).

Debugging - the technique of finding a problem (or bug) in a program by analyzing computer generated messages, program results and memory dumps (a picture of the contents of the program's space in memory at the time of failure) - computers of the future should have advanced self-debugging capabilities.

Machine Language - 1's and 0's - the language of the computer (on's and off's that the computer recognizes and uses to manipulate internal registers that are used to perform the functions it is programmed for).

Binary - a base 2 numbering system (what machine language is made up of - where the 1's and 0's are in the program is what tells the computer what to do).

Hexadecimal - a base 16 numbering system that serves as a compact shorthand for binary and machine language (the computer, for the most part, deals in chunks of binary code - bits and bytes).

Bit - a 0 or 1 (the smallest form of data understandable by the computer).

Byte - 8 bits (the 8 bits are divided into two halves of 4 bits, each of which totals 15 when all are turned on - hence the base 16 digits 0-F, which are easier for programmers to deal with) - since it is so ingrained, it will probably still be used by computer programmers of the future (just not to the same extent, as even today programmers need to understand Hexadecimal less than they did 10 years ago).
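
A quick Python illustration of one byte seen in binary, hexadecimal, and decimal (the value is arbitrary):

    value = 0b10101111     # one byte, written out as 8 bits
    print(bin(value))      # 0b10101111
    print(hex(value))      # 0xaf - each hex digit covers one 4-bit half
    print(value)           # 175 in ordinary decimal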

Abend - abnormal end of a program (this is generally a bad thing - like those times when a program crashes and you get to look at some stuff and decide if you want to send it to Microsoft).

Dump - a picture of what was in the computer's memory at the time of an abend (this is the stuff that you can look at before sending the report to Microsoft - I don't know how complete that dump is, however).


Okay, that's all I have for now. I hope that this helps.
 

I think FickleGM has covered a lot of the basics.

The trend in programming is a move towards canned software packages for development. A lot of shops (programming speak for IT departments) use so-called "4GLs" (4th generation languages) for development nowadays. Visual Basic could be considered a 4GL, since it relies on the Windows API (application program interface) to do all its work. I work with databases (specifically data warehouses) a lot, and the trend there is to use high-level tools like Ab Initio and Informatica to do much of the work. These tools allow programmers to move little boxes of object code around on a screen and give them basic instructions. So, writing programs becomes sort of like writing pseudo-code: "take this file, strip out these columns, sort it on these fields, merge it in with these other two files; sum up these numbers, put those results in this database table, isolate certain records and put them in a report, etc., etc."
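
To show what that pseudo-code style boils down to underneath, here's the same kind of pipeline written out in Python (the file name and columns are invented for illustration, and the merge step is omitted to keep it short):

    import csv

    # "Take this file, strip out these columns..."
    with open("sales.csv", newline="") as f:    # hypothetical input file
        rows = [(r["region"], float(r["amount"])) for r in csv.DictReader(f)]

    # "...sort it on these fields..."
    rows.sort(key=lambda r: r[0])

    # "...sum up these numbers..."
    totals = {}
    for region, amount in rows:
        totals[region] = totals.get(region, 0.0) + amount

    # "...and put the results in a report."
    for region, total in totals.items():
        print(f"{region}: {total:.2f}")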

Understand that virtual reality is a complex, CPU-intensive process. VR has a lot of applications in the medical field, pornography (this industry will lead the way in VR research, mark my words), and entertainment. But loading data into a database and generating the accounts payable report WILL NOT use virtual reality. That stuff is essentially just numbers and raw data.

It is possible that the PC of the future might utilize some sort of VR goggles to make using the PC easier. Imagine if a cubicle could be much smaller: you give an employee a single swivel chair and some wireless VR goggles (they really need to be wireless to be effective). His "desktop" could be a 360-degree environment, and his mouse could just be his hand (with a specialized glove). Coding in this environment might look something like dragging icons from the desktop into a document - drag and drop.

But this desktop would be the internet. So, the programmer might say "I need a quicksort routine", and it would place an icon in his area; he could pick it up and place it into his module. Programming is essentially steps - do this, this, and then this - and each of these steps is basically a module.
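
In miniature, that "do this, this, and then this" style might look like the following Python sketch (the step names and data are invented):

    # Each step is a ready-made module the programmer picks up and places.
    def fetch_records():               # hypothetical step module
        return [42, 7, 19, 3, 28]

    def sort_records(records):
        return sorted(records)         # the "quicksort routine", fetched rather than written

    def report(records):
        print(records)

    # "Do this, this, and then this."
    report(sort_records(fetch_records()))  # [3, 7, 19, 28, 42]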

I see programming like mathematics, in that a lot of people nowadays are reliant on calculators - you still have to know the steps, but I don't need to know how cosine is calculated, I just need to know how to use it to find the length of a side.
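
The calculator analogy, made literal in a few lines of Python (numbers invented):

    import math

    # No need to know how cosine is computed internally -
    # only that adjacent = hypotenuse * cos(angle).
    hypotenuse = 10.0
    angle = math.radians(60)
    print(round(hypotenuse * math.cos(angle), 2))  # 5.0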
 

By the time we can develop a true artificial intelligence - by that, I mean a separate, autonomous, self-aware consciousness - we'll have moved beyond programming. The AI can learn and program way faster than any human ever could; it will live years every day, learning by experience all that time. AIs will then build their successors, since no human will be able to grasp the complexity.

Since it's near future, I suggest it not be a true AI. Look up the terms 'fuzzy computing' and 'quantum computing' for some near-future ideas. Things like that will render most current terminology obsolete.

Maybe don't worry too much about not knowing much about computers: William Gibson, the 'father' of cyberpunk, didn't know diddly about computers. He got his ideas sitting in open-air cafes and listening to the drug runners talk; the cadence and rhythm of speech, the personalities... these are the important things. Then he wrapped that in a hypothesized future construct.

I'd also think you'd want to look long and hard at the 'virtual reality' idea and the 'AI as familiar' idea, unless you can bring something truly fresh and amazing in terms of characterization. Those things were done to death in the cyberpunk offshoots of the early and mid-Eighties.

Another idea (espoused by my college computer science teacher - how sadly wrong, wrong, wrong he was) is that by the time of your novel, programming is out. Nobody bothers with it anymore, because computers are so ubiquitous that all the basic structure is already written and in place. There are only so many ways you can write an accounting program, only so many ways you can write an inventory system. Programs are simply 'put together' like LEGO blocks. Computer programming is a dead-end job, like being a waitress or a warehouse loader. Programmers are what data entry people used to be; it's probably just a temp position, so easy that children often use it as their first 'real job', one they quickly move out of once they form the contacts and skills needed. 'Can I build you an accounting system?' is the 'do you want fries with that?' of 2150.

Your hero could be a kind of 'cyber-psychologist' diagnosing expert systems or primitive AIs to find out what went wrong. Maybe he's a quality control expert. Maybe he's an artist who designs user interfaces.
 

FickleGM said:
Okay, that's all I have for now. I hope that this helps.
It helps for sure! Much thanks. :)

WayneLigon said:
Your hero could be a kind of 'cyber-psychologist' diagnosing expert systems or primitive AIs to find out what went wrong. Maybe he's a quality control expert. Maybe he's an artist who designs user interfaces.
I like these ideas. I will probably build on that.

WayneLigon said:
I'd also think you want to look long and hard at the 'virtual reality' idea, and the 'ai as familiar' idea unless you can bring something truly fresh and amazing in the terms of characterization. Those things were done to death in the early cyberpunk offshoots in the early and mid-Eighties.
Mmmmh... that's the problem. To tell the truth, the only cyberpunk novel I ever read was Neuromancer. If all other cyberpunk novels are like that, mine should be a little different. My vision of the future is based more on Ray Kurzweil's views ("The Singularity is Near") than on previous cyberpunk novels. In any case, I think it's really difficult to make anything new and different.
 

Cyberpunk has been done to death, and is no longer exciting to read. Anyway, what is cyberpunk? I put a list below of what I think of when I hear 'cyberpunk':

-- Stories generally feature/involve borderline types, punks, criminals, drug dealers, detectives, etc. operating in the underground.
-- Stories happen in dark megacities such as a cold and dirty New York, Tokyo, or whatnot.
-- Mega-corporations tend to be "evil".
-- Virtual Reality is accessed through jacking into it, so you are either inside or outside the virtual world.

Well, if this defines most of the cyberpunk genre, I think my current idea remains outside of it. Nonetheless, feel free to add any cyberpunk cliché that I have forgotten from my list.
 

Programming issues have already been covered, but I'll chip in with what I think we can safely foresee in the future.

Worse programming, better hardware

I see it all the time. People write sloppy code and recommend hardware upgrades to correct their problems instead of coding it correctly the first time. Devices will be shipped with a bulky operating system running a small application, because that's faster and easier to do than to tailor a given program to a device.

No more pioneers

The days of the solo programmer are over. I had a hard time getting a job in my last search, in part because I had pretty much worked alone. Programming is very much a team-oriented activity. You will still get some exceptional cases, but what they build will either be something easily available in a commercial product or tailored to something so incredibly specific that only a small community will care.

Increased piracy

Most of what programming has become is starting with a package that's maybe 60% of what you want and building on that. Grabbing code off the internet is so common that people don't even think about it anymore. You just include the original programmer's name in the comments section if you are honest, or remove it if you are not. This trend will continue as entire software packages are placed on the internet. Rather than understand how they work, programmers will just toss the entire package in and see what happens. See "worse programming, better hardware" above.

Continuation of the contemporary interface

Virtual reality? Cybernetic brain implants? No. Maybe better voice recognition software, but that's the best you can hope for.


Basically, programmers will have less understanding of what they are doing, as more emphasis is placed on how the application looks than on how it works. If this sounds downbeat, the bright side is that more applications will be written, and even people with little understanding of programming will be able to throw something together for fun and entertainment (like the Neverwinter Nights fan-made modules).
 
