
The History of Computers

A decade ago? Not really. 10 years ago, what "MOST" people used they are still using. The "backbone" may have moved from Fiber to "WiFiber" and they may be storing their "stuff" in the "cloud", but basic "internet" hasn't changed much in the last 10 years other than it is "better". Better means faster, easier, more storage, but the basics … existed. There were LOTS of message/discussion boards 10 years ago … and before. Of course there have been improvements, but to see the real difference between the past and now … you have to go back more than 10 years.

And I am very aware of that, Professor Corey. And what they completely miss in this is the fact that before then, things changed very drastically and very frequently.

And yes, there were boards around then. But they did not really look like they do today.

At that time, they looked a lot more like 4chan does than Debate Politics. And that is because the creators tended to have been users of older systems like FIDO or Usenet, so that is what they copied. Systems like this one are relatively new, only becoming the standard in the last decade or so.

Forums are far from "new"; I was using similar systems 30 years ago. But they did not look like what we are using here.

Consider it as the difference between writing notes on pieces of bark, then the development of moveable type, then digital publishing. Yes, the basics are the same, but in many ways they are drastically different.
 

Whatever dude! Ten years ago was 2009, things haven't changed that much.<-Period

IMHO, you have to go to 1999 and before to see real differences. Other than speed and graphics, it was all there in 2009. I worked in the business (network engineer) and retired recently. Your gaming examples go back way more than 2009. I played MUDs; UO was the first really successful GRAPHICAL MUD, that goes back to 1997, and it was much more advanced than any of your examples; and that's just gaming.
 
I've read that the first common industrial use of mechanical computers was in the textile mills of Britain and France as early as the 13th century: counting threads for early weaving machines and following color patterns for mechanical embroidering. We've come a long way.

And yes, industrial (analog) computers go way back to the early 1800s.

Much of the early Digital Computer technology borrowed from older technologies because the systems already existed. The early Jacquard looms had their patterns encoded onto cards, originally metal and wood, then later paper. These were purely analog systems, not unlike what would be used in player pianos and music boxes. The pattern on a card was simply repeated over and over again, and making a new card with holes in different places created a new pattern.

(Image: Jacquard loom card)


But these were not "thinking" machines yet. There was really no logic in place; they simply did exactly what they were told to do.

At the end of the century, it was a variant of this card that Herman Hollerith used in the machines of his Tabulating Machine Company. This was one of four companies making tabulating machines that were later combined to create International Business Machines (IBM). But at this point, all these machines did was use analog mechanics to add things together. You could use a stack of cards, each with punches in a different location to indicate quantities or amounts, and the "computer" would go through them and add or subtract the right amounts from the right registers and give you the results. By 1933 this evolved into the 80-column alphanumeric keypunch cards that most over 40 are familiar with.
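To make that register idea concrete, here is a minimal sketch in C of what a tabulating run did conceptually: walk a stack of cards and add each punched amount into the counter it names. The card layout and register numbers are invented for illustration; the real machines were electromechanical, not programs.

Code:
/* Hypothetical illustration: a "card" here is just a register number and an
   amount, standing in for holes punched in particular columns. */
#include <stdio.h>

struct card {
    int register_no;   /* which counter this card feeds        */
    int amount;        /* the quantity encoded by the punches  */
};

int main(void) {
    struct card stack[] = { {0, 12}, {1, 7}, {0, 3}, {2, 40}, {1, -2} };
    int registers[3] = {0, 0, 0};

    /* The machine runs the whole stack through and accumulates totals. */
    for (size_t i = 0; i < sizeof stack / sizeof stack[0]; i++)
        registers[stack[i].register_no] += stack[i].amount;

    for (int r = 0; r < 3; r++)
        printf("register %d: %d\n", r, registers[r]);

    return 0;
}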

And after WWII when computers went digital, this remained the main way to enter software and data into computers. But as technology progressed the old metal "feelers" gave way to light as the way they were read. Paper tape using a similar technique was also used, but more commonly on mini and microcomputers than on mainframes. But by the early 1980's this technology was rapidly giving ground to video display terminals. And the keypunch card was largely dead by 1985.

I think the last time I actually saw keypunch cards in actual use was in the early 1990's. Some utility companies still used them for billing into the middle 1990's, but I have not seen one like that in 25 years. I imagine the last of them was retired before Y2K.
 
I have been generally avoiding the other systems, since few here would have ever used them.

Yes, I used LISA, as well as the original Mac. One of my first professional computer jobs was as a Mac tech on the PowerPC 7200 using OS 7.

And yes, there was AppleTalk, if you could afford to have 2 Macs. Not many people could afford that kind of expense unless they used them professionally.

But in those old systems, you actually had a fair amount of specialization. Most today are familiar with the idea of using a computer for just about everything. But back then, that was really not the case.

If you used an IBM, odds are you were doing some kind of professional application like spreadsheets, databases, or word processing.

If you used an Apple, odds are you were in either graphics or desktop publishing. That was one of its strongest abilities.

If you used an Atari ST, you were probably using it for music. A great many professional musicians were using the ST for composing, as it probably had the best MIDI capability of any computer of the era. In fact, its capabilities were so strong that some clever hackers figured out how to network 2 or more ST machines through their MIDI ports. This led to the creation of one of the first real-time multiplayer games, "MIDI MAZE". Amazingly crude and simplistic today, this game was mind blowing in 1987. My first experience with it was in 1988, playing with 5 other players in a simultaneous deathmatch.



And if you were using an Amiga, odds are you were using it for video. The Video Toaster is what really made the Amiga shine in this area. For the first time it allowed somebody to use a single inexpensive computer to do advanced video animation and editing. Chroma key ("Greenscreen"), switching between different feeds with various effects, advanced character generation with animation, even image manipulation. It was so revolutionary that the creators won an Emmy Award for technical excellence in 1993.

And although the Amiga and Video Toaster are long gone, the software actually lives on to this day. Today people in the video area know it as LightWave 3D, and it has been used in movies and shows ranging from GoldenEye, Avatar, and Star Wars: The Force Awakens to CSI, Firefly, and Stargate SG-1.

I am not ignoring these older systems for any reason other than most here have probably never seen them (if they have even heard of them). And I also consider the "Classic" Mac series as a distinctly different computer from those that came out after OSX. To be honest, I really consider the "Macintosh" computer to have died in 2001 when OSX came out. Today, I tend to place all of the Macs that followed the return of Jobs as successors to the NeXT computer. They really are not Macintosh computers.


ALL of that happened WAY before 2009.<-period
 
Whatever dude! Ten years ago was 2009, things haven't changed that much.<-Period

IMHO, you have to go to 1999 and before to see real differences. Other than speed and graphics, it was all there in 2009. I worked in the business (network engineer) and retired recently. Your gaming examples go back way more than 2009. I played MUDs; UO was the first really successful GRAPHICAL MUD, that goes back to 1997, and it was much more advanced than any of your examples; and that's just gaming.

Uh-huh.

Then I guess you either never used forums much back then, or you have forgotten how crude they were compared to today.

Go and visit 4chan. That has barely changed since it started in 2003 and it shows. In fact, it is a rather interesting glimpse into what a forum was like in 2003. And this was pretty much the norm until around 2006.

Essentially just HTML pages, where new entries are appended to the page. Limited if any quoting abilities. A very crude editing capability. Some did not even allow editing once a message had been responded to. I ran several forums in the early and mid 2000's, and they looked nothing like what we have today.
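As a rough sketch of what "appended to the page" meant, the storage model of one of those early boards could be as simple as the following. This is purely illustrative C; the file name, markup, and post text are all made up, not taken from any actual board software.

Code:
/* Hypothetical sketch: each reply is just more HTML tacked onto the end of
   one flat file per thread, roughly how many early-2000s boards worked. */
#include <stdio.h>
#include <time.h>

int main(void) {
    FILE *page = fopen("thread042.html", "a");   /* invented file name */
    if (!page)
        return 1;

    time_t now = time(NULL);
    fprintf(page, "<hr><b>%s</b> %s<p>%s</p>\n",
            "SomeUser",                      /* poster name   */
            ctime(&now),                     /* post time     */
            "New reply text goes here.");    /* the post body */

    fclose(page);
    return 0;
}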

And yes, much of the history I have been writing goes way beyond 2009. Where did I ever say it would not?

And yes, a lot of times the technology is "already there", it simply had not been developed or implemented yet.

Not unlike the difference between line numbers and labels. The capability to use labels has been in place for decades, yet until the 1990's line numbers were still the norm. Hence a listing of code for, say, C++ or Java looks nothing like what somebody writing in COBOL or C would recognize. There is absolutely nothing that would have prevented such a programming language in the 1970s (in fact some did exist), but it was simply not how it was done at that time.
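As a rough illustration of that contrast (a sketch only, not anything from the thread): C has had named labels since the 1970s, even while line-numbered BASIC was what most hobbyists actually saw.

Code:
/* In line-numbered BASIC the same loop would read roughly:
     10 LET I = 0
     20 PRINT I
     30 LET I = I + 1
     40 IF I < 3 THEN GOTO 20
   Below, the jump target is a named label instead of a line number, followed
   by the structured form most code uses today. */
#include <stdio.h>

int main(void) {
    int i = 0;
again:                               /* a named label, not a line number */
    printf("pass %d\n", i);
    if (++i < 3)
        goto again;

    for (int j = 0; j < 3; j++)      /* the structured equivalent */
        printf("pass %d\n", j);

    return 0;
}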

If you just retired, then maybe you should spend some more time remembering what things were like when you started. They are certainly not like they are today.

Or did you always use RJ45, and never work with BNC connectors and terminators or token ring DB9 cables?
 
ALL of that happened WAY before 2009.<-period

And once again, when did I ever say I was only discussing the last 10 years?

Considering I started this going back over 40 years, I find this obsession of yours to only deal with things since 2009 to be rather strange.
 
Red:
The quantity of users has no bearing on the central element of your thesis, which is that of the existence of a given technology. The number of users reflects the adoption -- status and rate -- of that technology, not whether and when the tech in question came into existence.

I do believe the thread title is the "history of computers," not the "history of the general public's adoption of computing software." As go "laymen's" adoption of computing technology (hardware and software), I think one can make a strong case that the development of GUIs effected and hastened that progression/outcome. That said, your thesis has to do with the history of computers, not the history of GUIs.

And you are completely missing that I have been discussing computers in how they related to everybody. Not just the few individuals like myself who actually used them at the time.

Once again, all you do is come in and blast away, not understanding what is actually being discussed, and trying to take things into a completely different direction.

If you do not like it or the title, then make your own thread and call it whatever you like.


Red:
The thesis from your OP:
Over the last few weeks, I have had several interactions with people who really have no concept of the history of computers, and think that everything that we have today is what we have always had. And they really have no concept how "new" all of what many of us use on a daily basis (including forums like this) simply did not exist a decade or so ago.

I didn't at all miss the central assertion of your OP, which is that computers and computing software that we now "use on a daily basis" simply did not exist a "decade or so ago." It existed then and well before.


You started your OP with an assertion about existence as the basis for your ridicule of folks' notions about computing history. For your case to have merit, your existence assertion must hold true, and it simply does not.
 
And if you were using an Amiga, odds are you were using it for video. The Video Toaster is what really made the Amiga shine in this area. For the first time it allowed somebody to use a single inexpensive computer to do advanced video animation and editing. Chroma key ("Greenscreen"), switching between different feeds with various effects, advanced character generation with animation, even image manipulation. It was so revolutionary that the creators won an Emmy Award for technical excellence in 1993.

At my peak I actually had THREE NewTek Toasters: one was Amiga 2000 based and two were 3000 based.
One of the 3000 based Toasters was used exclusively on the road, one exclusively in the BetaCam-Type C edit bay, and the other was used for a smaller Umatic edit bay.

I donated the old Amiga 2000 based unit to a church in Mansfield, TX in 2007 and to my knowledge, it is still chugging away today.
 
And once again, when did I ever say I was only discussing the last 10 years?

Considering I started this going back over 40 years, I find this obsession of yours to only deal with things since 2009 to be rather strange.

Hey in your OP you said:

they really have no concept how "new" all of what many of us use on a daily basis (including forums like this) simply did not exist a decade or so ago.

I'm calling BULL**** on that. NOW, IF you had said even 20 years ago, I would have let it slide. But you're comparing today with things that happened 30 years ago and saying WOW! Yeah, WOW 30 years ago; wow 10 years ago … not so much.

When I was in college, about 1993 (I was a retread), a bunch of us Computer Techs hijacked a computer lab one weekend so we could play network DOOM. (NETWORK DOOM, come on, that was almost unthinkable in those days.) It took us (some of the best minds on campus) about 3 … 4 hours to rework the network, we played Doom for about 3 hours, and then it took a few of us another 2 maybe 3 hours to put the network back to normal so class would happen on Monday. Things we do today in a few clicks were very hard back then; by 2009 … not really.
 
I'm a year older. I remember being offered the chance to study a new subject at school called "computer science" but my father banned me from doing this qualification.

"waste of time" he said, "nobody will ever earn a living working with computers" he said....

He was never very good with financial predictions and suchlike.

Red and off-topic:
...And thus is palpably illustrated due cause for, upon reaching majority if not sooner, one to reevaluate one's worldview, notions, philosophies, etc. garnered from one's parents. To be sure, not all that parents pass to their kids is awry, but enough is that, even while respecting one's folks and acknowledging the viability of their approaches in their time, one may often enough be well advised to disabuse oneself of certain doctrines they embraced, particularly those pertaining to social/cultural and professional mores and paradigms.​
 
That's NOT to say what's happened in the last 10 years with hardware, software and the "cloud" isn't amazing! AbsaByGodLutely it IS, and in the overall scheme of things … FAR outstrips ANYTHING … ANY THING that's come before. BUT the average consumer doesn't really see it. Which is EXACTLY as it was designed to be.
 
If you used an IBM, odds are you were doing some kind of professional application like spreadsheets, databases, or word processing.

If you used an Apple, odds are you were in either graphics or desktop publishing. That was one of its strongest abilities.

I am not ignoring these older systems for any reason other than most here have probably never seen them (if they have even heard of them). And I also consider the "Classic" Mac series as a distinctly different computer from those that came out after OSX. To be honest, I really consider the "Macintosh" computer to have died in 2001 when OSX came out. Today, I tend to place all of the Macs that followed the return of Jobs as successors to the NeXT computer. They really are not Macintosh computers.

It was a myth about who used what for what. There was never anything that could be done on a Wintel machine that couldn't be done on a Mac. Macs were de rigueur for law firms because of the macros in WordPerfect for the Mac; that ended when Novell bought WP and killed it. Then the courts went Wintel thanks to MS marketing. It was no secret that independent consultants hated Macs because users didn't need them, and they became a marketing arm for Wintel to maintain job security. :) We saw Macs used in retail stores, jewelry displays, design studios, for report generation and database construction, healthcare, insurers, actuaries, accounting, whatever.

SilverSurfer, the first relational database not for heavy iron, was Mac only for years, even after Marvel sued over the name and it was changed to 4th Dimension. It left other portable database languages in the dust and was only ported to Windows because its developers wanted entry into that market. Today 4D is the front end of choice for both Oracle and SAP. Quickbooks was originally developed with 4D.

Music on the Mac developed despite Apple. Studio engineers replaced front end boards with Macs, using programs they developed without help from Apple. Mostly thanks to Danny Goodman's Hypercard, the grandfather of HTML. The first commercially successful spreadsheets and CAD programs were developed for Macs. MS stole ideas and programming from everyone, out marketed and crushed competitors. I found security code from applications I wrote in Office, trademarked and copyrighted, forced MS to pay me royalties, only succeeding because they learned I was crazy and physically dangerous. It didn't matter to me that they screwed up the deployment of my code. :)

Apple was safe, because MS was making almost as much money from Mac deployments as it was from Wintel machines that paid bottom dollar. MS is still the top dog for Apple programming and sales. Linux stole some thunder, but not for those with jobs to do who didn't want to be bothered tinkering under the hood. UNIX is still the mission critical platform for the defense industry and similar high end tasking that demands real security. Good programming engineers have no problems restructuring UNIX platforms uniquely for self protection.

BTW, ironically, the only portable database languages still in major use are 4D, Apple's Filemaker and NoSQL. All three having transmogrified into interpreted programming languages with unlimited internal graphical interfaces. The rest have fallen by the wayside. 4D front ends are in use by every major bank and insurer. All three major stock exchanges use management tools developed in 4D. The list is endless. More so in Europe, since 4D is a French company.

The one thing users demanded of OSX was the user interface of the classic Mac OS. That basically hasn't changed, and it is the soul of the Mac computing experience. The major difference is access to the terminal for those heavy users more interested in a UNIX experience. Crazy old men like me. :)
 
That's NOT to say what's happened in the last 10 years with hardware, software and the "cloud" isn't amazing! AbsaByGodLutely it IS, and in the overall scheme of things … FAR outstrips ANYTHING … ANY THING that's come before. BUT the average consumer doesn't really see it. Which is EXACTLY as it was designed to be.

My career in film-video was interrupted by the Northridge Quake of 1994.
My computers were NewTek Video Toasters powered by Amigas, and the only other computing devices I had were an Ampex ADO (Ampex Digital Optics) special effects unit and a Unix powered edit controller for the videotape machines (VTRs).

I had to go to fallback work in Electronic Engineering and IT and only got back into video in 2002...part time.
It took me those eight years to recover from losing 350 thousand dollars worth of gear in the quake.
Blessing and a curse at the same time, the loss was horrific but I was spared the agony of going through the shakeup from analog to digital, so by the time I got back in, everything was definitely digital all the way.

The first nonlinear workstation I had was a Pentium III powered machine in 2003 with 1 GB of RAM that took 13 hours to render an eleven minute project that I shot on standard definition DV tape (the mini-DV digital cassette format popularized by cameras like the Panasonic DVX-100, a standard def DV camcorder with some popular film type technology).
Two tracks of standard def video, three audio tracks, minimal digital effects and minimal color correction and basic lower third graphics.

Dallas Team - Wheelchair Games Long Beach CA 2003 on Vimeo

The last nonlinear edit project I did was about a year ago, a 60 minute corporate project that had six HD video tracks, heavy color correction, lower thirds all the way through, extensive effects switching moves, seven audio tracks and was rendered at about 120% real time.
Sixteen cores, 32GB RAM, Cuda rendering on GPU.

Yeah, I'd say that hardware and software have come a very long way.
 
The first commercially successful spreadsheets and CAD programs were developed for macs. MS stole ideas and programming from everyone, out marketed and crushed competitors.

Actually, you are way off here.

The first successful spreadsheet was VisiCalc, and it was the "Killer App" that helped the Apple II enter the mainstream for business use in 1979. Over 5 years before the Mac came out (and over 3 years before the Lisa). And for the first year it was an "Apple Only" program, before it was finally ported to other systems like the IBM, TRS-80, and others. And by then it had competition. Most notably from Lotus 1-2-3.

The same with CAD. HP, for decades before the Mac came out, made a name for itself in the Engineering industry for its CAD workstations (Unigrafix). Oh yes, these machines did a great many things also (and ran a variant of UNIX as their OS), but in the engineering world they were seen as CAD stations. In the late 1990's I even worked for an architectural company that still had 2 of them in the back office. They were no longer used, but every month or so either the owner or the "old guy" (who had been working as an engineer since the 1960's) would fire one up and print off some of the old work they had done on it, often for a building they had made back in the day that needed remodeling.

They then took that print-off and handed it off to me to be scanned and updated into AutoCAD.

But the first "big" CAD software was CATIA, which like Unigrafix predates the Mac by almost a decade.

As far as Microsoft "stealing" ideas, everybody did that. Do you think Apple invented the GUI interface? Heck no, they got the idea from the same place Microsoft did. From Xerox PARC.

VisiCalc did not even invent the electronic spreadsheet, they simply streamlined it. And Lotus 1-2-3 did it even better.

And for Office, even that is not new. Most do not remember that in those early days, there were very few "Office Suites", and they generally sucked.

Generally you used either WordPerfect or WordStar for documents, dBase for database work, and Lotus 1-2-3 for spreadsheets. And this was very expensive, generally costing over $1,000 for all 3 of these programs. Then you had companies like Enable, which made one of the first "All-In-One" office programs. And oh yes, I remember this piece of trash all too well.

In the early days of desktop computers in the Military, we all piggybacked on an Air Force contract of the mid-1980's. It was a Zenith 286 that came with an EGA monitor, an Alps P2000 printer, and a software suite that included MS-DOS 3.3, Windows 1.03R, and WordStar. And from around 1984 until around 1990 this was the system most in the military used. But around 1990 the Marine Corps decided that they were going to cut costs, got a site license for the Enable Suite, and decided that would be the "Marine Standard" for word processing, spreadsheets, and the rest.

And needless to say, it was largely ignored. Yes, it did word processing, spreadsheets, and databases. But the software all pretty much sucked the big one. So we just agreed and sent memos to confirm our compliance to the new order, and continued to use WordStar and Lotus. And things stayed that way pretty much until most companies migrated to Windows. That move, the fact that Microsoft already had a decent office suite for the platform, and that nobody else had an office suite worth a damn for Windows at the time pretty much crushed the competition.

One thing that MS was always good at, was in predicting trends. In the early days of the "GUI Wars", you generally had a GUI that had 1 or maybe 2 "killer apps", and really nothing else worth bothering with. They had very obtuse and hard to work with requirements (often including backwards licensing fees) for anybody else that wanted to work with them, and by and large they died. MS on the other hand pretty much gave away their development kits. And they did not mandate any kind of fees or certification process for somebody writing for their platform, only if somebody wanted the "MS Certified" logo (which was not very expensive, was a 1 time fee, and entirely optional).
 
Microsoft also pretty much sold to anybody and everybody, whereas many of the competitors who had their own competing software only cared about their own users. Tandy actually had a very good GUI, but they kept it entirely in-house, so it was only used by Tandy users. And Microsoft had been doing that since the beginning. Instead of selling to only a single company, they licensed their software to anybody that would pay for it. Even before the MS-DOS agreement, they had done it with Microsoft BASIC.

This is one of the things that really made them wealthy. Much like when George Lucas was given the merchandising rights to "Star Wars", IBM let Microsoft keep the right to sell MS-DOS to anybody else. Like 20th Century Fox, IBM saw no money in selling the OS to other companies. Fox thought all the money was in making and distributing the movies, and IBM thought all the money was in making and selling the computers. Neither company realized what they had given away until years later.

But most of the "8 bit era" of home computers ran on MS BASIC. The Apple II, Commodore, TRS-80, TI-99, Atari, these computers and many others all ran on Microsoft BASIC. This is why the BASIC included in all Apple II computers is known as "Applesoft BASIC". The name was a portmanteau of "Apple" and "Microsoft", the company that sold it combined with the company that made it. And it was an advantage to each when it came to cost. Microsoft had already created BASIC for the Altair. They simply changed the code as needed for each platform, and the companies got a complete BASIC and OS for a fraction of the cost of developing their own.

But ultimately MS simply made better products for the most part. Yes, WordPerfect and WordStar were great programs. But they were not "Office Suites", which is what the industry was demanding by the late 1980's (and they were still DOS only). Starburst (the WordStar office suite) died during production and never came out (although the word processing aspect was released as WordStar 2000). WordPerfect finally combined a Windows version with Quattro Pro and Borland Paradox in 1993, but it hit the market 3 years after Microsoft Office for Windows did. And by then it was simply too late; they had lost their market share and never recovered.

If anything is proven in the computer industry, it is that you must adapt or die. You must also be able to predict future trends or die. The GUI revolution was coming, this was obvious by 1979 when many companies were looking at the Xerox Alto. Apple was a computer maker, so they approached their own version as a hardware issue. Since Microsoft made software, they looked at it as a software issue.

Then you have other companies, like Berkeley Softworks. They had tried to create an airline seat-back computer, a precursor to the systems that we often see today. But they were trying to do it in the early 1980's with a product called the "Sky Tray". They created an OS for this system, but it died while in development. So they simply took their OS, dusted it off, then managed to get it to work with the Commodore 64, and GEOS was born. It was later ported to the Apple II and then to the IBM PC.

And interestingly enough, it survived for many years afterwards. The Nokia Communicator (an early smart phone) used GEOS as its OS, as did PDAs from companies like HP, Brother, Casio, and others. It was also the GUI that everybody used if they ran AOL on a DOS platform. Some hackers even discovered how to hack the AOL floppy disk, and inside was a fully working install of GEOS.

Amiga, Atari ST, Macintosh, even IBM OS/2 and others were all GUI interfaces, all lifted from an idea that came from Xerox. But for some reason it is mostly Apple that claimed that Windows was stolen from them. Even Jobs later admitted that he and Bill Gates were both in attendance at the Xerox symposium that inspired both of them to make their own versions. And yes, even companies like Digital Research with GEM and Jack Tramiel with the Atari ST freely admitted that they had gotten their ideas from Xerox.

Only Apple was vain enough to try to claim it was their own idea and theirs alone. That is why the lawsuit between the two companies resulted in a major Apple loss (the only technicality being that MS "stole" the trash can icon). And although that lawsuit was finally settled in 1994, most of us knew it was effectively over by 1989 because of other previous lawsuits. First were the many suits over "look and feel" (such as VisiCalc and Lotus), and ultimately the 1989 Xerox vs. Apple case. In that one (which started after the Apple vs. Microsoft suit), it was determined that the concept of a GUI could not be copyrighted. Which meant that the next 5 years would ultimately go to Microsoft.
 