Recently, I read an interesting article about systemic technology-access issues in the city of Detroit. You can peep it here: Mich Tech News.
After perusing the article, I began to formulate my own analysis of the much-discussed Digital Divide.
What I find interesting is that Moore's Law is never considered when people discuss the Digital Divide. In fact, most arguments draw a correlation between the cost of technology and the Digital Divide. I assert that Moore's Law debunks that argument. If you're not familiar with Moore's Law, take a look at this article.
The key enabler of Moore's Law is the decreasing cost of transistors, brought about by vast improvements in silicon wafer manufacturing technology. Hence, we now have 4GHz CPUs that are manufactured at nearly the same cost as their early ancestor, the 386 CPU, of almost 30 years ago.
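To put a rough number on that, here's a back-of-the-envelope sketch. The two-year doubling period and the 30-year span are illustrative assumptions for this post, not measured industry data:

```python
# Back-of-the-envelope Moore's Law arithmetic.
# Assumption: transistor density doubles roughly every 2 years,
# so at a constant chip cost, you get twice the transistors
# per dollar each doubling period.

def doublings(years, period=2):
    """Number of doubling periods in a span of years."""
    return years // period

years = 30                 # roughly the span from the 386 era to today
n = doublings(years)       # 15 doublings over 30 years
factor = 2 ** n            # transistor-per-dollar multiplier

print(f"In {years} years: {n} doublings -> ~{factor:,}x more transistors per dollar")
```

Even if the real doubling period is closer to 18 months or 3 years, the point stands: the cost per unit of computing collapses by orders of magnitude over a generation.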
So the cost of manufacturing computer chips is dropping at alarming rates, and the cost of electronic gadgets (e.g., BlackBerrys, cell phones, etc.) is decreasing at alarming rates right along with it.
We know that many of our black youth are huge consumers of these gadgets.
Why, then, is it so difficult to get black children and families interested in technology and computing? Isn't this also a large contributor to the Digital Divide?
Another good article on the Digital Divide was written by Ejovi Nuwere.