Baptist Press - WORLDVIEW: The Word and the word - News with a Christian Perspective
The implications of this article struck me as fascinating, because they touch on something I have been thinking a lot about lately as a writer, and as someone interested in communications, codes, cryptography, and homiletics.
I've also been pondering a set of related questions: how is this gonna change information systems? How is this gonna change communications and communications systems? How will this change language itself, and how will this change codes and encryption? (Assuming these implications are true, and I suspect they probably are. Language nowadays is in many respects much more primitive than that of our ancestors; among many people, for just one example, personal vocabulary is far less well-developed and far less precise. Yet in other respects language is more complex than ever before, with terminological flexibility and neologism creation at probably an all-time high. I'm also beginning to believe that terms are becoming ever more plastic, as well as more "implicational, metaphoric, or hymnological" in usage, than perhaps at any other time in history.)
In any case, if language is changing in this way, whether through interaction with technological implements and systems (such as information and communication devices, and the internet), through linguistic exchange, or through other means, then how can these alterations and developments best be exploited to create new communicative capabilities? Perhaps even new linguistic functions.
For instance, can words themselves contain within them microbursts of related data and information? Could one compose language in such a way that a two- or three-word sentence contained both the (extended) vocabulary and the accumulated information of a much more complicated (linguistically speaking) sentence? Could a single word perhaps encapsulate an entire sentence or phrase, and still imply some real degree of useful precision?
Another thing that occurs to me: can the alphabetic nature of our language be (re)transformed into an alphanumeric form, as was the case with ancient Hebrew and Greek ("Greek is perfect, Latin is copious," as the old saying goes)? And is the conversion of words into an alphanumeric form one way of reducing terminology to "microbursts of inter-related data"?
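To make the alphanumeric idea concrete, here is a minimal sketch loosely modeled on Hebrew gematria and Greek isopsephy, in which successive letters carry the values 1 through 9, then 10 through 90, then 100 and up. Applying that ancient value pattern to the English alphabet is my own illustrative assumption, not an established scheme:

```python
# A sketch of reducing words to alphanumeric "microbursts," loosely
# modeled on Hebrew gematria / Greek isopsephy. The 1-9, 10-90, 100-...
# value pattern mirrors the ancient systems; mapping it onto the
# English alphabet is purely an assumption for illustration.

def letter_values(alphabet):
    """Assign values 1-9, then 10-90, then 100-..., as in gematria."""
    values = {}
    for i, ch in enumerate(alphabet):
        magnitude = 10 ** (i // 9)        # ones, then tens, then hundreds
        values[ch] = (i % 9 + 1) * magnitude
    return values

VALUES = letter_values("abcdefghijklmnopqrstuvwxyz")

def word_number(word):
    """Reduce a word to a single number: its alphanumeric microburst."""
    return sum(VALUES[ch] for ch in word.lower() if ch in VALUES)
```

Of course, many words collapse onto the same number, which is exactly the concision-versus-precision tension discussed below.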
Conciseness (Greek perfection) and preciseness (Latin copiousness) are often competing modes of communication, and necessarily so. But is it possible to develop an "unfixed" oral form of modern microburst shorthand (digitized idea forms) which still contains within it a much more complicated set of terminologies and related data and information forms, and which could, if necessary, be used to convert an oral microburst into a far more complicated and precise form of information exchange (maybe in text, maybe not)? I suspect that if you encoded within the short digitized idea forms a specific pattern set of analogical terms and ideas, it could be done. (Yes, I know the process is usually undertaken in reverse, by taking complicated analogical terms and developing a simplified digital code for process transfer and language function.) Next week I think I'm gonna try to write a short essay on the subject, and later a Theory Paper on the possibility and how it might work. (Right now I've gotta finish up some other work. But next week I'll be free.)
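As a toy illustration of the shorthand idea, an oral microburst can be modeled as a short spoken term that expands on demand into a much more precise textual form. The code words and their expansions below are entirely invented for the sake of the sketch:

```python
# A toy model (entirely hypothetical) of "microburst" shorthand:
# a short oral code word expands, when needed, into a far more
# precise textual form. The concise form is used in speech; the
# expansion recovers the precision.

MICROBURSTS = {
    "stormfall": (
        "Severe weather expected within six hours; "
        "secure equipment and move personnel to shelter."
    ),
    "greenline": (
        "All systems nominal; proceed with the scheduled operation."
    ),
}

def expand(utterance):
    """Convert an oral microburst into its precise textual form,
    falling back to the utterance itself if no expansion is known."""
    return MICROBURSTS.get(utterance.lower(), utterance)
```

The interesting (and hard) part, which a static dictionary like this cannot capture, is the "unfixed" quality: letting context reshape which expansion a microburst yields.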
Maybe I'll also incorporate the poetry coding experiments I'm working on into the body of the piece (at least the Theory Paper) to see if they could serve as one possible example of how it might be undertaken.
I'm also wondering if the same general principles might apply to the writing of Sermons and to homiletic construction, and maybe even to this project:
The Holiconic Impulse
It's something I haven't worked on in a while, because I was waiting to set up some new things in my lab. But maybe I can experiment with the idea of interjecting into the holographic image either microtext versions of scriptural verses, or even encoded textual components within the visual images reproduced in the graphic representations of the holograph itself. (An idea I'd like to pursue one day is maintaining holographic integrity at the visual level while still being able to sub-fracture other embedded components, such as linguistic components. In that way you could create a routed holographic encoding system which would appear as a normal holographic image under general or normal observational circumstances, but which contained within it sub-fractured steganographic, linguistic, and/or other forms of coding and creeping structures viewable under special circumstances.)
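As a rough analogy for the embedding idea (a real hologram is of course far more complex than a flat pixel array, so treat this strictly as a sketch), here is the classic least-significant-bit technique: text is hidden in the lowest bit of each pixel value, so the image looks unchanged under ordinary viewing but yields the text under special inspection:

```python
# Least-significant-bit steganography sketch: hide text in the lowest
# bits of pixel values. Each pixel changes by at most 1 out of 255,
# so the image appears normal; reading the low bits back recovers
# the message. Pixels here are a flat list of 0-255 integers.

def embed(pixels, text):
    data = text.encode("utf-8") + b"\x00"       # NUL byte marks the end
    bits = [(byte >> i) & 1 for byte in data for i in range(8)]
    if len(bits) > len(pixels):
        raise ValueError("image too small for message")
    out = list(pixels)
    for i, bit in enumerate(bits):
        out[i] = (out[i] & ~1) | bit            # overwrite the lowest bit
    return out

def extract(pixels):
    data = bytearray()
    for i in range(0, len(pixels) - 7, 8):
        byte = sum((pixels[i + j] & 1) << j for j in range(8))
        if byte == 0:                           # hit the NUL terminator
            break
        data.append(byte)
    return data.decode("utf-8")
```

Whether an analogous sub-fracturing could survive the interference structure of an actual holographic recording is exactly the open question.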
One last thing occurred to me. Suppose you could develop an oral language that inherently contained both microburst data clusters and automatically re-arranging (or self-arranging) terminologies that are multi-implicational? The idea would be to create an oral language that is both concise and precise, and that rearranges itself automatically according to the needs of the communicants. How the language is being used would determine whether precision or concision was being emphasized, and in what way and to what end.
Anyway, if anyone else is interested in these matters I'd be happy to listen to your ideas, thoughts, musings, and observations on the subject.