Thursday, November 17, 2011

#7 - Improving Lithium-Ion Batteries

If you use a Mac, you probably have to charge your laptop every six to seven hours. But what if you only had to charge it once every three days? In fact, what if all battery-operated products lasted ten times longer than they do today? A team at Northwestern University has found a way to improve how lithium-ion batteries store and use energy. By using an atom-thick layer of silicon graphene to encapsulate the lithium, as opposed to carbon, the research team has found that they can not only improve the capacity but also the charge speed of today's lithium-ion batteries. The details of the invention can be found on the Northwestern University website here.

Now, the implications of this invention will not “put a dent in the universe,” but it is still pretty significant. All of our mobile products are limited by the charge they can hold. Because of this, most cell phones can’t hold a charge longer than two days, and tablets and laptops no more than ten and seven hours, respectively. This means that we must always be close to a power supply or risk running out of power for our electronic devices. The Northwestern invention allows us to be less dependent on a local power supply, which gives users greater freedom and pushes the boundaries of smaller battery designs and more remote use of devices.

The future certainly holds more in regard to better batteries, but for now, it gets a little bit better.

Sunday, November 6, 2011

#6 - Innovation for Business Nerds

It's interesting how arguably the most innovative company in the world not only creates innovative products, but is even innovative in its business operations.  The company is Apple, and this past week, Businessweek did an article on how innovative (and ruthless) Apple's supply chain management strategy is in comparison to its competitors.

I won't get into the details of the article, which you can read, but I did want to discuss some high-level concepts that came to mind after reading it.  The first is that if a company wants to create innovative products, then its business operations need to be innovative and unique as well.  It's interesting how we're taught to think within a confined framework for operating a business, but that only works if you're selling plain-vanilla, market-saturated commodity products like bread or soda.  In Apple's case, their market was and still is saturated with competitors. Rather than just producing an innovative product and competing with an antiquated supply chain model, Apple sought out new ways to improve product quality, delivery to market, and even total supply chain control. Perhaps their greatest strategic move is the cash reserve that Apple leverages to obtain exclusive and cheaper manufacturing. Most companies place a small deposit with a manufacturer and pay the full amount later, meaning that manufacturers bear the risk of not being paid should the product and company fold. In Apple's case, the manufacturing cost is paid upfront before the product is even produced; who doesn't like getting paid first? In doing this, Apple is also able to get preference from manufacturers, who will put competitors' products on hold because going with Apple (cash now) is more secure for them. Pretty smart.

The second concept revolves around how Apple works heavily with manufacturers and material scientists to produce products and technologies as it desires (e.g., new lasers to etch invisible pinholes). Long story short, if it doesn't exist today, Apple and its manufacturers will create the technology to produce it.

The innovative products that Apple has produced over the last decade have changed the technology industry and the way we use technology in our lives. However, it was the innovative practices in its business operations that enabled Apple to create these products and become the company it is today.

Innovation isn't just for the technical nerds; it's for the business nerds too.

Saturday, November 5, 2011

#5 - The Big Data Problem… haven't we been here before?

This post is not so much about a particular technology or invention, but more about a concept and problem. A concept and problem that we have been trying to solve for the past 5,000 years. Enjoy.

I hear a lot about big data. I'm sure everyone has. My customers bring it up in meetings, co-workers and classmates have discussions about the topic, and soon the news will cover it as the next biggest threat to society. IBM even ran a commercial on the topic two years ago during prime-time television hours. So when one of my favorite magazines, Popular Science, did a weeklong series on the topic, I thought… maybe it's time I investigated the subject. After reading all of the articles for the week, I began to realize a couple of things:
  1. Data management has always been an issue
  2. Our current data issues are not as bad as we think; the sky is not falling
  3. We will always have a big data problem (unless we reach the maximum – yes there is a maximum)
I say data management has always been an issue because if you look at the history of library science, you will realize that our ancestors faced and tackled similar problems. One of the articles from Popular Science gave an excellent timeline of the various data management tools and techniques that humans have used to address the problems of their day.  Whether it was the creation of written language to capture ideas, the Dewey Decimal System to manage the searchability of large public libraries during the library explosion of the late 19th century, or the creation of a species classification system to manage the naming structure and lineage of organic life, there has always been a need and a system for data management.

So when we think of today and look at the rise and capture of massive amounts of computer-generated information, we're faced with the same problem as our ancestors: how do we organize this material for easier consumption? Currently, there are many ways to do it, and the one that has the most traction at this moment is actually over 2,000 years old: data sharing. The concept is simple: rather than create a new system or structure to house all the information, why not just combine the information through a shared network?  Several great examples of data sharing which have addressed the big data problem for their respective industries/fields are listed below:
·     The MD:Pro
·     WorldCat
In looking at other methods currently being used to solve the big data “problem,” it appears that the unstructured approach to data management is also proving promising. Because we’re generating so much data today, maybe it has become impossible (and not worth our time?) for us to classify every bit of information. I know this point will probably go against the thinking of data architects and data-minded individuals, but the trend appears to be shifting away from the traditional structured-data environments whose genesis was in the ’70s and ’80s.
In reviewing the various technologies and trends for data management, one should not forget that there really isn’t a “problem” or “crisis” in data management as you may have been led to believe. If you really want to know more about the big data problem, look deeper into the timing of the IBM commercial announcing the “problem”; you’ll notice that it coincided with the release of their big data product called InfoSphere… interesting.
Lastly, we need not worry or fret about the big data problem because, as history has taught us, data management is an iterative process and one which never stops… or does it? An interesting question and answer: when will we ever catch up to our big data problem? When we run out of space. Yes, there is a maximum amount of data space in the universe, and it’s estimated to be 10^90 bits. If you’re interested why, then I would suggest looking down the data rabbit hole of information theory. Be careful, there’s a lot of data to process.

Friday, October 21, 2011

#4 - Creating Humans

I know I told others within the program that I would stick to IT-related innovations, but I've secretly been harboring a deep fascination with, and researching, the concept of manufacturing humans.  Sounds crazy?  I think it is.  But the more I looked into the idea, the more I realized that it's actually feasible today. Now, I'm not a biologist, geneticist, or bioengineer, so I won't get into the nitty-gritty details of the following innovations, but if you look at them as a whole, I hope you realize, as I have, that we've already designed/created everything we need to build a living, breathing, and thinking production-line human being.

It all began with a discussion regarding an innovation from the first round of the course blogs. Someone in the class presented the concept of 3D organ printing. Though we were all amazed by the technology, I began to wonder "why stop there?" The scene from "The Fifth Element" where the "Divine One" is recreated from the DNA of an existing cell began running through my mind; if you haven't seen the movie, here's the scene that I'm referring to (caution: minor nudity):



In doing research, I've found that there are actually amazing similarities between creating IT products and humans (this article is starting to sound creepy).  For example, humans, like computers, run on hardware.  These physical components provide the mechanical and electrical signaling that we both need in order to transport signals and process the data and actions that let us move, see, and hear.  For humans, this hardware is already being created with technologies and innovations such as 3D organ printing (ears, liver, heart, etc.), skeletal (bone) replication, skin generation from existing cells, and the recent amazing work that has been done in creating muscular tissue in a lab.

For the most part, the creation of the physical (hardware) components of the human body has been the easiest part to manufacture; no offense to bioengineers.  The bigger hurdle has been on the "software" side of humans: improving the DNA code. Now, I should clarify that we've "technically" had the genetic code of humans for the past eight years, ever since the Human Genome Project completed its 13-year effort to map human DNA.  The issue is that over those eight years, it has taken us a while to understand and decipher the elements of the code. Today, science is at the point where we not only understand DNA gene sequencing (i.e., the combinations of DNA bases that determine eye color, height, male-pattern baldness, etc.), but we're actually improving and modifying the genes of living organisms. In fact, gene "design" or "coding" is now done on computers much like how we code software; we are designing the operating system of life on computers. Another element in this puzzle is a new innovation that self-replicates and changes the structural composition of DNA (much like the complex DNA structure of the "Divine One" in the movie), meaning that we can add even more genes and "reengineer" the human body to be, in theory, "better" than is currently/naturally possible. We are creating humans that are, for lack of a better word, "…perfect."
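The DNA-as-software analogy above can be made literal with a toy example. This is only a sketch of the idea, not any real gene-design tool: it reads a DNA string three bases (one codon) at a time and maps each codon to an amino acid, much the way a compiler maps source tokens to instructions. The codon assignments shown are real, but the table is deliberately tiny and incomplete.

```python
# A toy "DNA compiler": codons (three-base groups) map to amino acids.
# These four mappings are real, but a full table has 64 entries.
CODON_TABLE = {
    "ATG": "Met",   # methionine, the usual start codon
    "TTT": "Phe",   # phenylalanine
    "GGC": "Gly",   # glycine
    "TGA": "STOP",  # stop codon: translation ends here
}

def translate(dna: str) -> list:
    """Translate a DNA string into a list of amino acids, stopping at a stop codon."""
    protein = []
    for i in range(0, len(dna) - 2, 3):          # step through codon by codon
        amino_acid = CODON_TABLE.get(dna[i:i + 3], "?")
        if amino_acid == "STOP":
            break
        protein.append(amino_acid)
    return protein

print(translate("ATGTTTGGCTGA"))  # ['Met', 'Phe', 'Gly']
```

Editing a gene on a computer amounts to editing a string like the one above; synthesis machinery then turns the edited string back into physical DNA.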

Now, I don't know all of the implications and legal restrictions in manufacturing humans, but the point that I wanted to make is that the technology already exists today. The current patent laws regarding bioengineering mean that these technologies are "owned" by corporations, so it will take consolidation and/or partnerships between companies before a human can be manufactured.  This raises the highly sensitive debate of "when does life begin," but I ask: if we've already created all of the components necessary for life, what's the next step? It appears that we're heading in that direction already, but nobody seems to be putting the pieces together or looking at the long-term implications. When will today's "Divine One" be created?  Why not tomorrow?

Friday, October 7, 2011

#3 - Five-Nanometer Silicon Oxide Switches

Okay - I chose this new technology for two reasons. First, my inner geek just fell in love with the concept of utilizing nanotechnology to advance material science and manufacturing, and second, because I believe that manufacturing innovations don't get as much credit as end-user-centric innovations such as software. With that said, this posting will discuss the expansion of semiconductor data bandwidth as a result of advancements in silicon oxide switching (transistors).

I assume that most of us within the IT industry are familiar with Moore's law. If not, Moore's law states that “the number of transistors that can be placed on an integrated circuit doubles every 24 months.” That means chips get more complex, faster, and smaller every two years. This law has held true within the semiconductor industry for the past 40 years, but has now hit a roadblock. The reason is that there have not been any recent innovations in semiconductor manufacturing that allow the addition of more and smaller transistors within a single processor. You might not have noticed the roadblock, but if you look at processors in computers today, you’ll notice that the trend for companies like Intel is to stack processors with multiple cores (processors within processors).  The reason is that manufacturing technologies cannot create a processor core smaller than today’s, so semiconductor companies are now stacking (in a three-dimensional stack) their smallest processors as a workaround.
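The doubling rule is easy to sanity-check with a few lines of Python. This is only a back-of-the-envelope sketch, using the Intel 4004's roughly 2,300 transistors in 1971 as a starting point:

```python
def transistors(initial, years, doubling_period=2):
    """Project transistor count under Moore's law (doubling every 24 months)."""
    return initial * 2 ** (years / doubling_period)

# Intel's 4004 (1971) had roughly 2,300 transistors.
# 40 years of doubling every two years is 2**20, about a million-fold:
print(f"{transistors(2300, 40):,.0f}")  # 2,411,724,800
```

A projection in the low billions of transistors lines up with the high-end chips of 2011, which is why the law held up for four decades.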

However, since this blog is about innovation, there is an invention out of Rice University that not only allows processors to be created smaller, but will actually improve the fastest production processor's data bandwidth (memory capacity) from 32GB to 1TB. This completely blows past Moore’s law: far more than a doubling of transistors per processor, it’s an increase of 3,025%!  Jun Yao is the credited inventor of the tiny five-nanometer-wide silicon oxide switch, which is conceptualized in the picture below.
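The 3,025% figure checks out as a percentage increase if 1TB is taken as 1,000GB (a quick arithmetic sketch):

```python
old_gb, new_gb = 32, 1000  # 1 TB taken as 1,000 GB
increase_pct = (new_gb - old_gb) / old_gb * 100  # percentage increase over 32 GB
print(increase_pct)  # 3025.0
```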


The silicon oxide chip (image: Jun Yao/Rice University)
Jun claims his invention came about by looking where the semiconductor industry was not. In fact, the solution was in front of everyone’s eyes. For the past 20 years, the industry has been focusing on shrinking the physical strands of graphite transistors. Jun noticed that processors are already made of silicon, which also conducts electricity well… so why not make a switch out of silicon?  (One up for thinking outside the box!) In any case, the trials proved successful beyond Jun’s expectations, and now the industry is flooding Rice University and Jun’s team with grant money to continue their research. The current chip being tested has already exceeded 1TB of data bandwidth.

So what does this mean for us users?  I don’t know. One thing that comes to mind is the impact on the computer manufacturing industry in the late 1990s: there was a small wave of technology improvements in manufacturing that created more powerful processors, servers, etc. This explosion of powerful infrastructure components allowed now-popular technologies such as virtualization to take hold and revolutionize the IT industry. Because an invention such as Jun’s is at the front of any technology wave, its impact cannot really be predicted and will only truly be understood by looking back years later. One can only speculate, but perhaps with more powerful processors, data centers will become even smaller, and cloud computing will become cheaper and more feasible with smaller and MUCH more powerful processing capabilities. Thin clients, anyone?

Friday, September 30, 2011

#2 - Listening Platforms

For this week, I would like to discuss a new technology category that is changing the way businesses manage and monitor their markets and brands.  The category is called “listening platforms,” also known as “customer analytics.”  With the explosion of blogs, forums, and other internet feedback technologies, companies and marketers now have a wealth of data with which to assess their market and brand strategies.  This, however, also presents a problem: there is too much data.  Because of the semantics and idioms of human language, initial listening platforms proved inadequate and incapable of deriving true meaning from large data sets of customer feedback.  However, over the past decade, refinements have been made to listening platforms through expansion and innovation in computational linguistics.  The end result has been a suite of products that today are widely used by major companies and marketing firms in the e-commerce marketplace.
As of today, the major trend has been the consolidation of smaller listening platforms into larger business intelligence products. One such example was the acquisition of Talkback by SaaS, along with similar acquisitions by Oracle, IBM, and other large BI companies to expand their service offerings.  The market, though, is still young given the vast number of smaller listening platform products that are still in existence and heavily used.  A good list of popular listening products can be found here.
At the root of it all, listening platform products scan customer feedback and postings on websites, social media sites (Twitter, Facebook, etc.), and user communities where products are discussed.  With the current advancements in linguistic intelligence and social media networks, the sky’s the limit for listening platforms, though the landscape will certainly change with future acquisitions looming.
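To make the scanning idea concrete, here is a toy sketch of the core loop of a listening platform: score each post by matching it against positive and negative word lists. Real products use far deeper computational linguistics to handle semantics and idioms; the word lists and posts below are entirely hypothetical.

```python
# Toy brand-sentiment scorer: +1 per positive word, -1 per negative word.
# The word lists are hypothetical stand-ins for a real sentiment lexicon.
POSITIVE = {"love", "great", "fast", "reliable"}
NEGATIVE = {"hate", "slow", "broken", "terrible"}

def sentiment(post: str) -> int:
    """Return a crude sentiment score for one post."""
    words = set(post.lower().split())
    return len(words & POSITIVE) - len(words & NEGATIVE)

posts = [
    "I love this phone, the camera is great",
    "battery is terrible and the app is slow",
]
for p in posts:
    print(sentiment(p), p)  # prints 2 for the first post, -2 for the second
```

A real platform would run something like this over millions of posts, aggregate the scores per brand, and flag sentiment shifts over time; the hard part, as noted above, is that human language rarely matches word lists this cleanly.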

Thursday, September 15, 2011

#1 - LightRadios

We all know what cell towers look like. They're usually 100 feet tall (unless they're on top of buildings) and stick out easily in any environment. But what if a cell tower could fit in the palm of your hand? A company called Alcatel-Lucent has created such a thing, and they believe it can change the world of telecommunications (or at least make coverage better and costs lower). The company has not only managed to reduce the size of cellular antennas, but has also consolidated multigenerational networks (2G, 3G, and 4G) into one single antenna. This technology has created quite a buzz within the telecommunications industry, and Sprint is already testing the cubes within its own network.

Now the analysis. "LightRadio," as the product is called, has its advantages and disadvantages. The most obvious advantage is its size. Unlike cell towers, which require lots of energy and a small dedicated power supply to operate, the LightRadio cube runs on only 1.5 watts (yes, I did say 1.5 watts).  This allows the cubes to be easily installed on lampposts, bus station coverings, and other locations previously impossible for traditional cell site technology.  In addition, the cubes can be monitored remotely and even directed to provide directional signaling to address local congestion. This makes their maintenance cheaper while also improving network coverage. Remember, current cellular towers broadcast in all directions, so most signals never reach users; wasted energy equals higher costs.

The disadvantages of the product center on its limitations.  For one, a cube only has a coverage distance of 2.5 city blocks. That means cell towers will still be used, but their numbers can be reduced to instances where long-distance transmission and coverage are required.  The second disadvantage is that LightRadios must still be physically wired to cellular base stations. This can create a nightmare of wires for network engineers, but Alcatel-Lucent is currently working on integrating base station components (smaller, of course) into a LightRadio cube by 2014.

All in all, the advantages of a product like LightRadio are welcomed within the telecommunications industry as a simpler and cheaper way to address the explosion of data needs on their networks.  Sprint's immediate interest and announcement of testing just one month after the product's release shows that the industry is looking for smaller and cheaper ways to expand existing networks.  Who knows, maybe you'll see a LightRadio on your way to work tomorrow.


Hello there!

As part of my Emerging Technology course at George Washington University, I will be blogging about ten new emerging technologies throughout the semester. Each posting will focus on a different technology or innovation and its impact on the technology industry and users as a whole. I hope you enjoy the topics and opinions shared within this blog. Comments are always welcome, so please post away.