News You Can Use: 2017

What's happ'n Now.

This is a news article for the site that will be posted "often" if not monthly for the purpose of keeping you up with the news of what's going on in the computing world.


John Wolf

John is a retired engineer enjoying a second childhood by tinkering with electronics.

Dateline: June 21, 2017 - 8-bit Computer

So later in the year, when people ask me, "What did you do this summer?", I can say, "I built a computer from scratch with chips and logic gates." Of course, I'll get blank stares. "Why?" would be their first question. I did it because I wanted a model to teach young folks how computers got created in the first place, what makes them work, and to illustrate the amazing progress of engineering as we know it, from rooms full of relays to 'super computers' the size of a laptop. The magic realization is that it's these same little gates that are still used, but millions of them working together versus a few hundred devices, limited only by cost, space, and the chemistry needed to make integrated circuits.

What better time to start to tell the tale than the first day of summer.

I spent a lot of time sketching out ideas of how to do this, and all my efforts paled in comparison to a series of YouTube videos I stumbled across, created by one Ben Eater. His design was inspired by a book called "Digital Computer Electronics" by Albert Malvino and Jerald Brown. In this book they describe the SAP-1, or Simple-As-Possible, computer. This is the very concept I wanted to pursue. So I watched Ben's videos and built a close approximation to his design. The great news is it works beautifully! Check out the projects page for details.

Once you cut up nearly 100 feet of hookup wire into custom lengths to fit all this stuff onto 14 proto-boards, you realize sticking with only 8 bits is the only practical level for this project. Like everything digital, things go up by powers of 2 and grow exponentially, and would soon become too unwieldy if, say, you went to 16 bits or greater. What you realize is that you are on the path the inventors took to get to where we are today, and you appreciate all the engineering that took place to make this journey feasible.

The computer, I'll call it Rex, only has RAM space for 16 instructions, including space to store the variables. Rex only adds or subtracts 8-bit numbers, so it is limited to integers between 0 and 255. It also includes a feature to display two's-complement numbers between -128 and 127, so you can see numbers below zero. This is so limiting that Rex is only good for demonstrations, but it does cover all the bases to get the job done, and besides, with a zillion LEDs flickering, it's fun to watch.
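The two's-complement display trick is easy to sketch in a few lines of code. The helper names below are my own invention, just to show how the same 8 bits yield both readings:

```python
# The same 8-bit pattern, read two ways.

def as_unsigned(byte):
    """Interpret 8 bits as an unsigned integer, 0..255."""
    return byte & 0xFF

def as_twos_complement(byte):
    """Interpret the same 8 bits as a signed integer, -128..127."""
    byte &= 0xFF
    return byte - 256 if byte & 0x80 else byte   # top bit set => negative

print(as_unsigned(0b11111111))          # 255
print(as_twos_complement(0b11111111))   # -1
print(as_twos_complement(0b10000000))   # -128
```

The only difference between the two displays is how the top bit is interpreted; the hardware stores exactly the same pattern either way.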

The whole computer centers around an 8-bit bus in between solderless proto-boards that each have a basic function plugged onto them. Some overlap onto adjacent boards has to happen here and there, but for the most part, this project is modular in scope, as Mr. Eater's videos so skillfully demonstrate. So what does it take to pull together a computer?

Digital circuits are based on timing. If it ain't got that beat, everything is meaningless. Since all the intelligence a computer has is in how we interpret when certain bits are switched on or off, we have to start with a clock for the design. It might seem presumptuous to say all this computer power is based on a collection of switches, but if you have been following the digital logic series of discussions on this website, you should know by now that yes indeed, that's precisely what is going on. That's why the binary numbering system is so important to all this. It matches the need to model an electromechanical system with two states - on or off. We abstract this by saying a logic true or false, or voltage values of 0 volts or 5 volts, is reflected in the computer by switches being either on or off. So our ability to represent the mechanical device is only as good as our mathematical model, and binary numbers fill the gap nicely with a few tricks along the way. Since binary numbers only have the digits 0 and 1, we map this to our concept of on or off.
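The switch idea can be sketched in software. These little gate functions are a stand-in of my own (not part of Rex) for the physical chips, treating 1 as 5 volts and 0 as 0 volts:

```python
# Software stand-ins for logic gates: each takes bits (0 or 1)
# and returns a bit, just as the hardware treats 0 V and 5 V.
def NOT(a):    return 1 - a
def AND(a, b): return a & b
def OR(a, b):  return a | b
def XOR(a, b): return a ^ b

# A half adder wired from two gates: returns (sum bit, carry bit).
def half_adder(a, b):
    return XOR(a, b), AND(a, b)

print(half_adder(1, 1))   # (0, 1): one plus one is binary 10
```

Everything bigger, including Rex's adder, is just more of these gates wired together.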

Our clock segment is more than just an oscillator buzzing away. We need to be able to turn it on when needed and, maybe more importantly, turn it off. We also need to step through the actions of the computer to evaluate whether the thing is functioning correctly. When everything is working right, we can let the clock ripple through instructions and come to a halt when the operation is complete.
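In software terms, the clock module's behaviors look something like this sketch; the class and method names are invented for illustration, not taken from any schematic:

```python
# A software caricature of the clock module: it ticks until the
# halt line is asserted, and each tick can also be driven one at
# a time, like a single-step pushbutton.
class Clock:
    def __init__(self):
        self.halted = False
        self.cycles = 0

    def tick(self):
        """Advance one clock cycle, unless the halt line is asserted."""
        if not self.halted:
            self.cycles += 1
        return self.cycles

    def halt(self):
        """A halt instruction simply stops the oscillator."""
        self.halted = True

clk = Clock()
clk.tick(); clk.tick()
clk.halt()
clk.tick()                # no effect once halted
print(clk.cycles)         # 2
```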

Another major consideration is memory. We need a bucket to put our bits in that the computer can reach, to store our program actions and the data to be operated on. In the Rex computer, I used SRAM. Since nearly all the design is built from the 74LS TTL series of chips, the RAM is a pair of 74LS189 chips; each is only 4 bits wide, so two are needed. These chips only have room for 16 4-bit words, so all together we only have 16 8-bit words, but that matches the bus size, which is another basic lesson learned: the buses used in computers are parallel wires moving words around at the word size of the memory structure. Can we do anything with just 16 memory locations? Yep, because all we are doing is adding or subtracting two numbers and displaying the answer.

We also need to load instructions into some sort of register to read their opcode and see what instruction is being performed (another module), and we need to place data from this same memory space into a register for each operand, so we need an A register and a B register, both set up to dump their values into an adder. The 74LS283 is a 4-bit full adder, so two of them are teamed up to stay in compliance with our 8-bit system architecture.

If we are clever, we can combine the substeps needed to move things around on the bus to completely run a particular instruction, and that's where microcode comes in. All instructions have to be fetched from memory, so for a given location in memory, we have to pull the instruction, move it to the instruction register to be digested, and increment the pointer to the next instruction. If we compress all the steps needed, we can do the fetch in two steps. But we haven't done anything with the instruction yet. Let's say we are loading the A register with a value in an instruction called LDA, for "load A". After the fetch, we have to turn on the bus so the A value in memory can be seen by the A register.
The key thing to realize at this point is that all these chips require a clock cycle to operate. This has to be the case to keep all the functions separate until called upon by some enable signal. When we get to defining all our instructions, we'll find it takes up to 5 steps to complete any particular one. So how do we deal with that?
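To make the datapath concrete, here is a loose software model - my own sketch, not Ben Eater's schematic - of the RAM, the two operand registers, and the 8-bit adder that stands in for the pair of chained 74LS283s:

```python
# A software model of Rex's datapath pieces.
RAM = [0] * 16                    # 16 words, 8 bits each

def adder(a, b, subtract=False):
    """Add or subtract two 8-bit values, wrapping like the hardware."""
    if subtract:
        b = (~b + 1) & 0xFF       # two's-complement negate of b
    return (a + b) & 0xFF         # keep only the low 8 bits

# Mimic LDA/LDB-style bus transfers: RAM out, register in.
RAM[14], RAM[15] = 28, 14
a_reg = RAM[14]
b_reg = RAM[15]
print(adder(a_reg, b_reg))        # 42
print(adder(a_reg, b_reg, True))  # 14
```

Subtraction falls out for free: negate the second operand in two's complement and add, which is exactly why that numbering trick earns its keep.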

First, we have to have a counter that steps through the memory to pull each instruction. That's a module called the program counter, but we need to hold it off until 5 more little steps can be done to complete the instruction. This is done by a separate counter (another module), and this little operation is called microcode. It's a self-contained little set of steps where we switch on and off the required control lines that provide the enabling logic to the chips needed to complete the instruction and finish the microcode cycle. Once we get through these 5 steps, we let the program counter proceed to fetch the next instruction. We do this until we hit a halt instruction, which basically stops the clock.
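The microcode idea can be sketched as a table: each instruction maps to up to 5 micro-steps, and the first two are always the shared fetch. The signal mnemonics below follow Ben Eater's published control-line names (MI is memory address in, CO is counter out, and so on); Rex's actual table may differ in detail.

```python
# Each instruction is a list of micro-steps T0..T4; a step is the
# set of control lines asserted on that clock tick ("" = idle).
MICROCODE = {
    "LDA": ["MI|CO", "RO|II|CE", "IO|MI", "RO|AI", ""],
    "ADD": ["MI|CO", "RO|II|CE", "IO|MI", "RO|BI", "EO|AI"],
    "OUT": ["MI|CO", "RO|II|CE", "AO|OI", "", ""],
    "HLT": ["MI|CO", "RO|II|CE", "HLT", "", ""],
}

def run_instruction(mnemonic):
    """Print the control signals asserted on each micro-step."""
    for t, signals in enumerate(MICROCODE[mnemonic]):
        print(f"T{t}: {signals or '(no-op)'}")

run_instruction("ADD")
```

Notice that T0 and T1 are identical for every instruction - that's the compressed two-step fetch described above.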

We are almost done. We need a place to store our answer after we add or subtract. In Rex, we place the answer back into the A register. These registers have the ability to read from and write to the bus, so this is as good a place as any. We want to provide this answer to the output or display module, so we have a step in the OUT instruction to turn on the A register to the bus and turn on the display module to read from the bus. Once the answer is on the display, all we have left is to stop the clock and read the answer.

There is one more very important module, and that is the control logic module. There is a series of diagrams on the Project Page for this project showing the functional areas or modules, the central bus they all use, and the control signals each module needs to function. Those signals are brought together in the control logic module and are divided up by instruction type and micro-step of the microcode mentioned earlier, so each step sends the signals to the required module to do its job. In Rex, this is done via an EEPROM. Each instruction step forms an address into the EEPROM, and each read from the EEPROM produces the signals needed for that step. Rex uses 15 bits to cover all the bases, so two 8-bit EEPROMs are ganged together. These bits don't go into a register; they go directly to the chips that use them, so the EEPROM outputs are routed all over the place to orchestrate the operation of the computer. In a normal computer, this would be the control bus.
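The EEPROM-as-control-logic trick can be sketched like this; the bit assignments and ROM contents below are invented for illustration and are not Rex's actual wiring:

```python
# Opcode and micro-step together form the EEPROM address; the byte
# stored there is the pattern of control lines to assert that step.
HLT, MI, RI, RO, II, CO = (1 << n for n in range(6))   # made-up bit order

def control_word(rom, opcode, step):
    """Look up the control lines for one micro-step of one opcode."""
    address = (opcode << 3) | step   # room for 8 micro-steps per opcode
    return rom[address]

rom = [0] * 256
LDA = 0b0001
rom[(LDA << 3) | 0] = MI | CO        # fetch step 0: address the RAM
rom[(LDA << 3) | 1] = RO | II        # fetch step 1: latch the opcode

word = control_word(rom, LDA, 0)
print(word & MI != 0, word & CO != 0)   # True True
```

The whole "divided up by instruction type and micro-step" arrangement is nothing more than this address arithmetic burned into the chip.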

So that is fundamentally what Rex does and how it is constructed. All the details are on the Projects Page. So that's what I did for the summer of 2017. If you have any comments or feedback you can contact me via the Contact Page, but you'd be better served to just watch Ben Eater's wonderful video series.

Dateline: May 8, 2017 - Del Mar Electronics Show

Seems like I've been here before, really. The same vendors in the same spots for the last two years. In fact, I could tell they'd aged, but the props and flashy buttons looked just as before. Maybe they never went home after the show...? It was fun to walk through and smell the old capacitors, but the give-aways were mostly candy. Down at the Digikey area, the guy giving away back-scratches was up to his old tricks, but I miss the days when you went to these affairs for the bag clips. None to be seen anywhere.

I'm always a sucker for a box of switches with bright LEDs in all the colors of the rainbow, but somehow the luster seemed a bit faded. No innovation, nothing new and exciting. Is electronics dead? I caught myself thinking, "I'll bet if this were in China, there would be robots throwing acrobats from one side of the hall to the other. There would be manufacturing of ICs right there in front of you, with tiny mechanical arms buzzing, stitching gold wires between components and lead-frames." This show was a bit of a yawn.

I came home and tossed the lanyard with my registration tucked inside onto the pile of lanyard name tags from the past. Maybe next year it will be more exciting, or maybe I should switch and go to the garden show.

Dateline: April 1, 2017 - Big Daddy Data

There has been a disturbance in the Force. Big Data and Big Government, not to mention your run-of-the-mill hacker, are all in collusion to watch and record your every gesture when you use your computer. Even Congress has decided to allow ISPs to sell your browser history. Everyone wants to know where on the Internet you travel. Do these people assume that if you visit a high fashion clothing site you must be gay? Yes, that's exactly the kind of muddy thinking Big Data analysts worry about. They have to make sense out of nonsense by prying into your every move on a keyboard. Why? They say they gain valuable insight into your commercial habits and can increase sales to those gullible enough to buy only the products placed before them by the online marketeers. Bull feathers.

I apologize for the cynicism. I do recognize that statistical techniques applied to sales data are useful for predicting what to manufacture or stock to optimize sales, if not factory distribution to market. But all of this is perfect only in the academic setting of engineering economics 101. In the real world, humans are fickle. Well, maybe not so much in America, where fads and fashion are worshipped. The problem I see is that collecting personal data goes beyond sweater color sales for next winter to... hmm, that's interesting, I can use this information in a negative way and remain anonymous. I have insulation. The person whose data it is can be taken advantage of. Humans have a long history of messing with each other, especially if undetected or licensed by loose politicians making laws influenced through lobbying.

All of this is to legalize spying on people. That should worry everyone. What happened to the Fourth Amendment? Is that just another item circumvented by Congressional skullduggery? There is no good reason to be doing this. I'm surprised they don't just do it behind our backs. Oh, they do! Ever heard of supercookies? The technical term is UIDH, or unique identifier header. Verizon has already been slapped with a fine for not telling people they were being spied on. This cookie is added to any HTTP request after it leaves your machine and enters the ISP's server. Your website visits are now Big Data and can be sold by the ISP without your knowledge, even if you decide you don't want to be tracked and clear your browser history. And guess what - after the big boys got their hands slapped, they got Congress, a Congresswoman in this case, to legalize the act. For every wall, there are those who build a longer ladder to go over it, or around it in this case. The last figure I saw put the Big Data industry just north of 187 billion dollars in value to those companies that invest in finding out what style of sneakers you buy. When that kind of money is being spent to find out what you ate for dinner last night, nothing can stop them from getting what they want, short of you quitting the Internet. They count on your addiction.

Here's my put - sue the first ISP that sells your data for at least a 50% cut. Or better yet, if this is so valuable, I shouldn't have to pay for Internet service at all. Let my browser history pay for it. It's your data. You should profit from its sale.

The problem is if companies already have in-place ways to spy on your Internet activities and what content you are seeing, then they can read what location maps you bring up, phone calls you make, where you go, because your phone tracks your location. You have no private life if you use a Smart Phone and do your business on a computer connected to the Internet. What could go wrong here? If someone had nefarious deeds in mind and lives at your ISP's house, you are screwed. Who would do such things? Stay on the Internet and you just might find out.

So what do you do? The Internet is part of our culture now. We need to stand our ground and get lawyers involved to establish laws such that when a person is screwed over by Big Data or Big Daddy, whoever invades your privacy gets the pants sued off of them. Identity theft is getting out of hand, and computer data spinning around willy-nilly gets leaked to criminals. ID theft takes place every 10 seconds or so - those are the statistics. I'd say it's already out of hand. Every few weeks there's another major 'leak' or hack, and social security numbers are ripped off. All the major ways to identify you and your bank account are now easy for hackers to reach if they have access to your personal information. Well, that seems to be up for grabs now.

The best solution is to use privatized networks. All important data should never leave or be exposed to the general Internet. The Internet is now open to all, even when encrypted means are in place, because the 'trusted agents' at the end of the line can't be trusted anymore. They will sell you out in a heartbeat. Not much has changed in the world, really. If you have a secret, don't tell anyone. If you want to keep things private, you have to treat them as secret. The only way to not tell anyone is to not tell anyone, I mean anyone. Cloud data is not private. Cloud data is for sale. If billions are being spent on taking just a copy of your data from you, do you think a cloud provider will not dip into the pool? You're nuts if you trust them.

Safes were invented to keep private information safe. Networks were designed to tell everyone on them everything. The rest is hand waving. Private networking among trusted partners is the only way to use computers safely. Even having a VPN channel (Virtual Private Network) will not suffice. Intranets, not the Internet. Smart phones are secret agents spying on you. Wake up and smell the oozing data.

Dateline: February 19, 2017 - Clouds: I've seen both sides now

The number one dilemma of geeks everywhere is how to protect my hard-earned data from disappearing. Back-ups are fine, but what about fire and theft? Okay, I'll send a copy to my uncle's house for safe keeping so the data isn't co-located. Bad idea on many levels, not to mention my uncle is nosey. What media do you use? CDs are passé now. No matter what you use, the device that reads it will be obsolete in a year.

Ah, the cloud! This all started with off-site data storage websites like Carbonite. For a monthly fee, they store your data. Now everyone that sells computers or software has a cloud service. It does solve the problem of off-site storage in a convenient manner (as long as you don't drop the payments). So where does your precious data go, exactly?

Cloud-stored data goes a lot of places, and that's good and bad. It ups the chances your data is safely stored and a copy can be recovered, but it also ups your data's exposure to 'trusted' agents. The chances of being hacked go up just as fast. I find the pros outweigh the cons. What I do find disturbing is the bossy cloud algorithms that handle your data.

Apple's iCloud is certainly convenient, but it basically wipes your hard drive clean of your data and stores all of it in the cloud. It then lets you borrow the data back like a greedy banker who gets hold of your money. What if the Internet goes down? No data for you. Oh, you can keep a local copy packed away in an archive-folder dust bin. You can see your data on any Apple product connected to the Internet, hopefully protected by a password. That's really not a good idea unless the only thing at stake is your favorite cat pictures being seen by unauthorized eyes. There is also the problem of it taking days of upload time to get a bunch of data onto the cloud in the first place.

Many online storage companies use proprietary software to manage their service. Not a good idea. Your data is held captive, their code goes out of date, and you have to pay for the updates. If you decide to go elsewhere, what happens to their copy? No one knows for sure.

Here's a plan no one offers. Your data stays on your machine and is backed up locally (preferably as a hard drive image). You decide what files are to be placed in cloud storage - files you don't mind sharing. You have a network connection to your own off-site server to store not only your data but your application software as well - a hard drive image. Not managed, just a bloody image copy; no fancy interpretation algorithms, just the real deal. This can be as simple as sending a trusted friend an image copy to keep for you on a USB disk drive. The cloud files are updated when you place your machine in 'sleep mode' or prior to a shutdown, not while you're working. Instant cloud updating is a nuisance. It really ruins any creative computing. But what if a meteor hits your computer while you're typing and you lose your latest inputs? That's the least of your problems. You will be in the clouds with your data floating forever.

You're probably keenly aware that data storage doesn't have a single best solution. Cloud storage is a good place to start, but don't get involved with an overbearing management system where you don't control the pieces on the board. It's still a good idea to keep a copy of your hard drive image nearby. Computers break. At least you can be back on the air with a new computer or replacement hard drive. The chances of a catastrophic disaster in your computer den are low in the first place. Theft, on the other hand, is always real, and the stats on that are scary. Maybe you'd best spend a few bucks on a steel door with a cipher lock and a good security system, like a large dog.

Dateline: January 22, 2017 - Mongo, Mongo

Most of the mumbo-jumbo this article talks about is not recent, but spans about a decade, sliding down the slippery slope into recent usage and well into the Eureka! stage. So what's Mongo anyway? MongoDB is a database system destined to become an even bigger deal as each new web app is created. Why? It's very efficient, nearly mirrors JavaScript objects, and conveys information in a much easier format than anything in the past. Like Ethernet, it's cheap and easy - and open-source! And that usually means a takeover of whatever is 'superior' or any of the other available options.

Maybe even more significant, MongoDB is a non-SQL, or non-relational, database scheme. What! Yep, no more tables, records, or fields. No more complex joined tables and query strings that take a ton of code to reach. There are a lot more complex things to say about relational database schemes, but it's not necessary to discuss them, because MongoDB will probably replace most of it anyway. So how does it work? Consider the JSON format that has now replaced XML as a way to move data around on networks. Also consider that Node.js is becoming the programming choice for developers running server-side code. Coupled with client-side apps being made dynamic with JavaScript, it's a JavaScript world. But what do we want to use for database storage? Well, something based on JavaScript to keep the code flowing more efficiently. That would exclude SQL-based DBs. This is where MongoDB stepped in. Not only is it based on JSON formatting, but this new idea of dropping the complex manipulation of DB tables makes it easy to sell. So how can you not rely on tables?

If you look at a JSON file, it's fundamentally an associative array of key-value pairs between curly braces. It can also embed other arrays and objects, so there's nothing you can't format. Now if you add an indexing system to make each JSON object available via a simple line of code, then you have data retrieval. With a parsing function, your data is returned. It's easy to store a collection of these JSON objects. In fact, that's what the scheme is: a collection of documents, where each document is a JSON-like file with your data tucked inside. Yes, you could store duplicate data in error, but you write code upfront to avoid that. You should write verification code and security-filtering code prior to storing data in the first place anyway, so in the end this approach is world class from a professional point of view.
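The document idea can be mimicked with plain Python dicts. This is a minimal sketch of my own, not MongoDB's actual API, where a 'collection' is nothing but a set of keyed documents:

```python
# A toy document store: dicts standing in for JSON documents.
import uuid

collection = {}   # maps _id -> document

def insert(doc):
    """Store a copy of the document under a generated unique _id."""
    doc = dict(doc)
    doc["_id"] = uuid.uuid4().hex   # stand-in for Mongo's ObjectId
    collection[doc["_id"]] = doc
    return doc["_id"]

def find(**query):
    """Return every document whose fields match all the query pairs."""
    return [d for d in collection.values()
            if all(d.get(k) == v for k, v in query.items())]

insert({"name": "Rex", "bits": 8, "tags": ["TTL", "SAP-1"]})
print(find(bits=8)[0]["name"])   # Rex
```

No tables, no joins: retrieval is just matching key-value pairs against whole documents, which is the heart of the non-relational pitch.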

There have been some cool additions with this "new" approach, especially in the indexing area. Mongo adds an ID field that incorporates a timestamp and the MAC address of the hardware storing the data as part of the normal unique indexing, like you'd find in a SQL DB. This is an amazingly powerful management tool for the database. Now that the world has the JSON.stringify and JSON.parse functions, MongoDB data can be converted in and out of straightforward JSON streams. The format augmentation MongoDB uses is called BSON, or binary JSON, and it is a binary stream, so it stores in a compact form and moves fast over networks.
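Python's json module does the same job as JavaScript's JSON.stringify and JSON.parse, which makes the round trip easy to demonstrate (BSON layers a binary encoding on top of this idea, which the json module does not model):

```python
import json

doc = {"sensor": "A7", "readings": [1, 2, 3], "ok": True}
wire = json.dumps(doc)    # object -> JSON string (like JSON.stringify)
back = json.loads(wire)   # JSON string -> object (like JSON.parse)
print(back == doc)        # True: the round trip is lossless here
```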

Like all these open-source frameworks these days, Node.js, MongoDB, and the Internet are natural partners. This market, if you will, unleashes all sorts of "newer, faster, stronger" add-ons. Express was the big deal a couple of years ago for making Node.js dance on the Internet. Now we have Mongoose as the Mongo driver for Node.js. Coupled with Mocha for testing, and a dozen other "essential" add-ons, web developers are in a heyday right now. So if you want to get onboard, check out Udemy.com and MongoDB University and start learning the wave of the future - whether you are into writing applications for the Internet, opening an online store, or just generally wanting to take over the world.

2016 Archive

2015 Archive

2014 Archive

2013 Archive

This website opened in late 2013 in hopes of supporting those interested in electronics as a hobby and a profession.