Sunday, December 29, 2013

Supply Chain Holiday Bedlam

The holiday bedlam keeps extending well beyond the holidays. During 2013, consumers saw the continued erosion of Black Friday as retailers offered shopping opportunities not only on Thanksgiving Day but, in a few cases, before it. Christmas inventories began to appear as early as mid-August in craft stores and late September in some retail outlets. A series of articles about the shifting patterns was posted:

Which Companies are winning the Online Delivery Race?

Online Shopping Growth Delays Christmas Deliveries from Amazon and Parcel Services

UPS, FedEx Scramble to Deliver Delayed Christmas Packages

Now that the Christmas shopping is over, late arrivals from online shopping trickle in and returns spawn the reverse supply chain flow. Many consumers are upset over late arrivals of online purchases, placing the blame on the logistics networks. Who is really to blame for any problems the customer had in completing a purchase during the holiday season?

Holiday Crankiness a Consumer Problem?

Some experts blame the consumer for their behavior; procrastinating, ordering late, and viewing the holiday season in terms of economics and not the nationalistic or spiritual reasons for the holidays. These experts tend to view the holiday season accentuating vanity as gifting becomes outlandish and tempers run high amidst the holiday crowds as people take the seasonal bustle personally. These experts see the world shifting away from nationalistic and spiritual prides towards an economic pride in which status, worth, and esteem are driven by the ability to make purchases and gift to others. Some CEO's reject that notion and view the issue a business problem. They see the model as the consumer is always right then the issue must be within the supply chain and getting the product to the end customer.

Do Supply Chains Compete in the Temporal Domain?

The problem is not logistical routes or the ability to fill the supply chain with goods. Capacity and capability play major roles, but both have their limits. The problem is more temporal: the gifts did not get there on time. Online retailers are now competing in the time domain. The best reduction in time is walking into a store and making the purchase, if the inventory is on hand. The bottom line up front for online retailers is that, short of teleportation, there is a minimal time the delivery process takes, even in the leanest supply chains. So how do the goods traverse from point A to point B in an amount of time reasonable to the customer and at a cost reasonable to the consumer? A common sense answer is to economically ration the delivery time.

The supply chain cannot make promises without proper compensation to at least break even on the promise, and the customer cannot make demands for which they are not willing to pay. Pure Economics 101 is the solution: supply and demand. The customer should select the solution they are willing to pay for, knowing the risk involved. As the holiday date approaches, shipping costs begin to increase in order to ration guaranteed on-time delivery; these are typically overnight or expedited deliveries. Then, within a specified time of the holiday, the guarantee no longer holds, as an overnight order can no longer be delivered by Christmas morning. Some online retailers may already be doing this at some level. However, a more accentuated scale would attenuate demand to something reasonable to manage. Of course, the downside is that sales are turned away due to costs or because goods cannot arrive in time for Christmas. Each online retailer will need to balance risk and return in deciding how to scale this. After all, consumers may recall that last year a company was late and look for another company to fulfill the need.
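A minimal sketch of such a rationing schedule, with purely illustrative rates and cutoff windows, might look like this:

```python
from datetime import date

def shipping_quote(order_date: date, holiday: date, base_rate: float = 7.95):
    """Ration guaranteed delivery by raising the price as the holiday nears.
    The tiers and multipliers here are illustrative, not real carrier rates."""
    days_left = (holiday - order_date).days
    if days_left < 1:
        return None              # past the overnight cutoff: no guarantee sold
    if days_left <= 2:
        return base_rate * 4.0   # overnight/expedited only, premium price
    if days_left <= 7:
        return base_rate * 2.0   # expedited window, rising price
    return base_rate             # normal ground service

print(shipping_quote(date(2013, 12, 20), date(2013, 12, 25)))  # 15.90, expedited
print(shipping_quote(date(2013, 12, 24), date(2013, 12, 25)))  # 31.80, overnight
print(shipping_quote(date(2013, 12, 25), date(2013, 12, 25)))  # None, too late
```

The exact tiers would have to be tuned to carrier capacity; the point is simply that the price rises as the guarantee becomes harder to keep, turning away just enough demand.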

Of course, the hub and spoke method of forecasting and then strategically staging inventory is another solution, though it runs a risk of overstock, and some lengthy delivery times may persist in outlying areas. So in realistic terms, the best delivery time that can be achieved is overnight for delivery on Christmas Eve. However, the customer must be willing to pay for that service. Most logistics companies want their staff home with their families, so Christmas Day deliveries most likely will not occur. If that service is offered, I am certain the customer will pay dearly for it.

Supply Chain Futures

A more far-out look at the temporal concern is Amazon's robotic delivery solution, which affects capacity and capability by having 'drones' or robots make deliveries in lieu of human-operated systems. There are numerous models for this type of service. I want to relate something I saw years ago that has some application to the Amazon robotic delivery system. In an upscale city in the Midwest, the townsfolk did not want their garbage set out on the curb. Instead, they set up a service in which the compactor truck slowly drove down the street while hopper carts raced up into the driveways where a trash receptacle was placed beside most homes. The trash was loaded into the hopper on the cart, which then raced to the next home. When the hopper was full, the cart pulled up to the compactor truck and dumped its load. Once the compactor truck was full, an empty one would replace it. The activity was a sight to see, as usually anywhere from 4 to 6 hopper carts were buzzing around collecting garbage in a synchronous dance with the compactor truck. This approach comes to mind with delivery services, except the reverse would hold true: full vehicles would replace emptied vehicles as the packages are disseminated by robots.

In the Amazon model, the robots may assume traditional logistical routes, moving goods in the hub and spoke system between distribution points first. This could be something as simple as unmanned airborne vehicles, UAVs, that fly dedicated routes dropping off cargo containers, then picking up another container to fly to the next distribution point. These UAVs would operate 24/7, moving goods directly to the distribution points. Delivery services such as UPS, FedEx, USPS, etc. would then humanize the service by making personal doorstep deliveries.

Image 1: QuadCopter can be used to deliver packages
Source: Oberwelz Design Projects
Image 2: CL-327 Guardian - Counter rotating rotor heads
Source: Canadian UAV Trials
In a later generation of the robotic system, technologies exist for robotic surface and airborne vehicles to deliver goods to the doorstep as well. Hovering or flying courier vehicles, either with quad rotor arrays (Image 1) or counter-rotating rotors (Image 2), could deliver the goods from an airborne or surface mother vehicle centrally stationed somewhat like the compactor trucks, using swarm technology. The actual courier vehicles would be of a different design and size than the example images, as well as equipped with robotic arms and/or package grips for drop-off in delivery receptacles on homes and businesses. The advantage of quadcopters is stability. The advantage of a ducted counter-rotating rotor design is quieter flight, as the rotors are sound canceling. Thus, 24/7 deliveries could be ongoing. The courier robot would rejoin the mother vehicle once a delivery is completed for a telemetry download and the load of the next package delivery. Multiple courier robots could operate from a single mother vehicle.

Full Circle

The supply chain can be leaned out further by having robots operating 24/7 moving goods around, or by simply speeding up the supply chain using forecasts and strategic buffers. Economically rationing delivery guarantees is another way to match the demands of customers to those willing to pay for quicker service guarantees. In the end, there may be a combination of solutions. However, there are cost and temporal limitations to getting the goods into the hands of a buying public, and online retailers are close to those limits short of teleporting the goods based on a thought in a buyer's mind:

Customer, "Beam it over, Scotty!"  

Scotty, "Aye Captain, I am giving it all she'll take. It just won't go any faster."

This brings the Christmas bedlam full circle to the meaning and reason for the season, which is to reflect upon the Judeo-Christian God and his character, which is patient and temperate. People should get back to the basics of the season, demonstrate patience and temperance, and all this bedlam can be put to rest. In the meantime, the supply chain will continue to seek ways of compressing the delivery time, as that is the competition.

Wednesday, December 25, 2013

Information Theory Overview

Comment: Some time ago, I became interested in information theory, partly due to my career and mostly because I began seeing elements of the theory popping up everywhere in movies, theological commentaries, war fighting, etc. I studied the theory off and on, purchasing books, watching movies, reading essays, and in general following wherever I caught a wisp of the theory. The interesting thing about truth is that it is self-evident and reveals itself in nature, so I did not have to look far. A curious thing about information, though, is noise, or that which distracts, like a red herring, and there is plenty of noise out there. Anyhow, the point of this post is an information theory overview. I would like to share basic information theory and relate it to the world around us. I will be coming back to this post, updating and refining with more citations.

Information Theory

Information theory is relatively new and is part of probability theory. Like the core disciplines of mathematics and the sciences, information theory has a physical origin with broad spectrum applications. The theory has captured the attention of researchers, spawning hundreds of research papers since its inception in the late 1940s, and has generated interest in deeper research involving biological systems, the genome, warfare, and many other topical arenas. Claude E. Shannon, PhD, is the father of generalized information theory, which he developed in 1948, theorizing:

If the receiver is influenced by information from the transmitter then reproducing the influencing information at the receiver is limited to a probabilistic outcome based on entropy. 
Figure 1: Mathematical Formulation of Entropy (H) in a system
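The figure itself is not reproduced here, but in Shannon's standard form the entropy H of a source emitting symbols with probabilities p_i is:

$$H = -\sum_{i=1}^{n} p_i \log_2 p_i$$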
There are several terms in the thesis statement that may be difficult to grasp, and the mathematical formulation (Figure 1) may be overwhelming to people who wonder how entropy and information are linked. Entropy is an operative concept behind diminishing returns, or the rate at which a system dies, decays, or falls apart. Entropy operates under order as formulated in Figure 1; thus, the phenomenon is not random. Within the context of information theory, entropy is the minimum size of a message before meaning or value is lost. The notion of probabilistic outcomes involves multiple possible results, each with a degree of uncertainty, or a possibility that the result may not occur. For example, a roll of a die is limited to only six possible outcomes. The probability of any one outcome occurring is 1 in 6. The uncertainty in rolling the die is high, being 5 in 6 that any specific outcome will not occur. As for the mathematical formulation, I will just leave that for general knowledge of what it looks like.
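As a quick worked version of the die example, the entropy of a fair six-sided die can be computed directly:

```python
import math

# Shannon entropy H = -sum(p * log2(p)); a fair die has six outcomes at 1/6 each
probs = [1 / 6] * 6
H = -sum(p * math.log2(p) for p in probs)
print(round(H, 3))  # 2.585 bits: each roll resolves about 2.6 bits of uncertainty
```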

The thesis points towards a 'lossy' system and promotes a simplistic communication model, Figure 2.
Figure 2: Simple Information Theory Model
From the thesis, formula, and model, more complex theories and models spawn, coupling information theory to biology, quantum physics, electronic communications, crowds, and many other subject matters. All fall back on entropy, or the smallest message before it loses its meaning. The big question is: so what? We will explore the 'so what' in the next section.

Information Theory Around Us

Most people fail to realize that information theory impacts us on an everyday basis. Aspects of the theory appear in movies, underpin all biological sensory capabilities, and appear in information networks in many ways. Many people philosophize that human and natural existence is purely information based. Let us explore information theory as it is exposed to most people, who have some familiarity with the sciences, movies, and religion at some level, beginning with a survey of the sciences.

Atom-smashing experiments during the 1970s led to the discovery that the universe has boundary limits. According to physicists such as Richard Feynman, the father of quantum computing, matter ceases to exist at 10⁻³² meters. When matter ceases to exist, so does space-time: matter has dimensions, time's arrow spans dimensionality, and when matter no longer exists, neither does dimensionality, time being mutually inclusive. What remain are non-local waveforms, or electromagnetic waves, which are illustrated as strings that vibrate. The region where this occurs is the Planckian realm, where matter is quantized, or discrete, having the qualities of a bit of information. Matter and energy are interchangeable based on the Theory of Relativity, Figure 3, and the wave-particle theory of light. Those vibrating waveforms in the Planckian realm slam together in a process of compactness that is not fully understood, forming a particle having discrete size and weight and possessing a positive (+), neutral (0), or negative (-) charge. These particles then begin to assemble in a computational, algorithmic manner, based on charge and tri-state logic, into more complex particles, from the sub-atomic into the physical realm. In the physical realm, complex molecules form, such as DNA, from which biological life emerges.
Figure 3: Theory of Relativity Formula
E = mc², i.e., Energy = Mass × (Speed of Light)²
DNA is somewhat unique, according to Dr. Matt Ridley. This is because not only did a computational information process arrive at the DNA molecule, but injected into the DNA molecule is a four-letter information alphabet (G, C, A, and T) which is used by nanites to endow biological life. Nanites are intelligent molecular machines, made of amino acids and proteins, that perform work. These molecular machines have claws, impellers, and other instruments. They communicate, travel, and perform work based on DNA informational instructions. The information process continues as even more information is applied against the DNA strand, such as variations in the timing, sequencing, and duration under which a gene fires. By varying the timing, sequencing, and duration of a firing gene, specific features are managed on the life form under gestation. Dr. Ridley quips that the genome is not a blueprint for life but instead a pattern maker's template having some sort of Genome Operating Device, a G.O.D. (Ridley, 2003). The point here is that there is some sort of intelligent communication ongoing during the endowment of life and the development of the natural universe, all of which is the outcome of computational processes and information.

During the 1960s, extensive research was conducted into the operation of human biological sensory processes in relation to information theory. This research concluded that the senses of sight, sound, touch, smell, and taste undergo an electro-chemical process in which information is encoded into electrical waveforms using Fourier transforms. The base Fourier equations are somewhat ominous, Figure 4.
Figure 4: Fourier Transforms Equations
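In their standard form, the transform pair that Figure 4 refers to is:

$$F(\omega) = \int_{-\infty}^{\infty} f(t)\, e^{-i\omega t}\, dt, \qquad f(t) = \frac{1}{2\pi} \int_{-\infty}^{\infty} F(\omega)\, e^{i\omega t}\, d\omega$$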
The equations are shown only so the reader can see what they look like; extensive mathematical background and practical understanding of how these equations perform are necessary to appreciate them. In lay jargon, Fourier transforms encode and extract information embedded in a waveform. These waveforms are constructed from the biological sensory devices: eyes, ears, nose, tongue, and skin. Once the information is encoded into a waveform, the human brain stores the information holographically. Consider the operation of the eyes, attached as part of the brain. The reason for two eyes is that they act symbiotically: one eye is a data source while the other acts as a reference source. When the waveforms from these two sources collide, information is encoded in the resulting constructive and destructive patterns. These patterns are then imprinted into the brain material to be recalled on demand, as humans think in terms of images and not attributes. The human brain is said to be capable of storing up to 6 terabytes of information. The eye has a curious tie to the quantum realm, detecting a photon of light coincidental with the smallest instance of time, Planck's time, which is of the order of 10⁻⁴³ seconds. This leads to the concept of quantum reality, or the idea that human perception is limited to the boundaries of the natural universe.

The human experience is said to be a digital simulation, and the universe computationally organized. This lends credence to the creative license of writers and authors who imagine story lines such as The Matrix, Timeline, The Lawnmower Man, and countless others. But the concept of information in the human experience goes even further, into theologies.

Information Theory and The Theological Argument 

Theologies are belief systems, inclusive of godless belief systems such as atheism and agnosticism as well as monotheistic and polytheistic systems. Theologies also include belief systems that deify idols and objects, such as Pantheism, which deifies mother Earth. Cosmologies and philosophies are generally treated as theologies as well, since the adherent's behavior is the same as any theological adherent's. Religion is the practice of theological doctrines. All theologies and related religions are information based; that is, all theologies possess defining information effecting actions among believers and/or reactions among non-believers. Some theologies are well organized and others are left to individual expression, but all are based on some sort of information or lack of information. Adherents of any specific theology or religion gather, learn, and discuss the merits of their theological doctrines by communicating in order to increase membership or believers.

Under information theory, the Judeo-Christian Bible demonstrates many interesting qualities. Judeo-Christian theology advances one of the most comprehensive narratives of all the theologies, detailing events from before the foundations of time to the end of time. The narrative begins not with the creation of the universe but with a war between good and evil. Michael, the Archangel, defeats the rebelling forces led by Satan. In the meantime, the Judeo-Christian God was busy creating the natural universe, and he gathered the defeated rebels, casting them into the universe like a bolt of lightning, effectively trapping evil in the world. The Judeo-Christian God then turned his attention to the souls, the believers, whom he knew before the creation of time, and breathed their souls into human embodiments on Earth, embodiments architected in his image. The purpose of this was to test their loyalty through trials and tribulations, war having broken out in heaven. Evil, trapped in the world, causes the fall of humans into rebellion, and the Judeo-Christian God declares war on evil as an outcome. Christ shows the believers the way by communicating the narrow path home. This epic war between good and evil in the world is actually an information war.

Information warfare, IW, is a broad field of warfare that leverages ambiguity, deceit, innuendo, doubt, and truth in order to dominate the battle space, or more correctly, space-time. IW tactics often involve hostile jamming, man-in-the-middle attacks, and psychological operations in order to diminish, deter, deflect, or degrade the will of the opposition to fight.

Satan is labeled the great deceiver. The Judeo-Christian God is truth, or virtue. Satan placed doubt in Eve's mind about what God really said, causing her to rebel against God and bring Adam into her fallen condition. God declared war on evil (Gen 3:15). The Bible was then inspired by the Holy Spirit and scribed by humans as a 'message' to humans prescribing the way back home. In the Bible, humans were given a mission, the Great Commission, to spread knowledge and information about the message of Christ. In the meantime, other theologies, either already operative in the world or spawning, were designed to confuse the Biblical message or to conduct hostile jamming of it. For example, the Koranic Surah 9:5 is counter to Matt 26:52, and the Koranic Surah 112 is a direct denial of John 3:16. These conflicting messages between theologies are commonly referred to as noise in information theory; in IW, given the intent, this is called hostile jamming.

In response to noise, the Bible has been designed to pre-empt hostile jamming by delivering its messages multiple ways and multiple times. Thus, by maintaining a high signal-to-noise ratio, jamming attempts can be defeated. The more important the message, the more ways and times the message is repeated. The Bible tactically uses parables to encrypt messages that humans who have the Holy Spirit, the decryption key, in them can understand. The Bible also makes tactical use of language, being written originally in Greek and Hebrew. Greek is very precise, leaving little room for mistaken translation. Hebrew makes use of the highest language compression ratio, communicating messages both phonetically and pictographically. Hence, the use of many ways and means of getting the message through is an IW tactic.
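In coding terms (my analogy, not a claim from the text itself), repetition is the simplest error-correcting scheme, and a small sketch shows how repeating a message defeats a noisy channel:

```python
import random

def transmit(bits, flip_prob):
    """Noisy channel: each bit flips with probability flip_prob (noise/jamming)."""
    return [b ^ (random.random() < flip_prob) for b in bits]

def encode(bits, n=5):
    """Repetition code: say each bit n times."""
    return [b for b in bits for _ in range(n)]

def decode(bits, n=5):
    """Majority vote over each group of n repeats."""
    return [int(sum(bits[i:i + n]) > n // 2) for i in range(0, len(bits), n)]

random.seed(1)
msg = [1, 0, 1, 1, 0, 0, 1, 0]
raw = transmit(msg, 0.2)                    # ~20% of bits corrupted in transit
coded = decode(transmit(encode(msg), 0.2))  # repeated, then majority-decoded
print(msg == raw, msg == coded)             # the coded copy usually survives
```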

Under the auspices of information warfare, a war can be fought and won solely on the basis of information, without committing conventional ground forces to battle. However, the final epic battle in Christian eschatology is fought and personally won by Christ, who is the victor over all evil. This becomes necessary because human hearts are hardened and many humans become demonically possessed, such that their ears do not hear and their eyes do not see. Thus, all communications have failed.

Additional support for the information perspective is that the Judeo-Christian God spoke the universe into existence, that is, communicated information (Gen 1:3). This act of speaking a universe into existence was perhaps carried out in the informational manner discovered by science and is ongoing today. Perhaps the Genome Operating Device varying the information of timing, sequencing, and duration of firing genes is the Judeo-Christian God's hand in the universe. Whatever the theology, the underlying premise is informational. Humans then decide to act based on the information they have or understand.

In conclusion, information theory affects us every day in our world views, sciences, and personal beliefs, leading to actions. Information theory hovers around the notion of a lossy system in which meaning is lost due to noise, interference, natural decay, and provocative measures. Creative works have leveraged information theory to advance ideological movements. Even the human life form may very well be the result of pure information, causing humans to rethink theologies and established beliefs in a new light.

References:

Knuth, D. (2001). Things a computer scientist rarely talks about. Stanford: CSLI Publications.

Moser, S. and Chen, P. (2012). A student's guide to coding and information theory. United Kingdom: Cambridge University Press.

Reza, F. (1994). Introduction to information theory. New York: Dover Books.

Ridley, M. (2003). Nature via Nurture. New York: Harper Collins Publishers.

Tuesday, December 24, 2013

ZEO Folds Under

ZEO was a sleep tracking product that provided online tracking history and advice regarding sleep patterns. The service operated for about five years, then terminated services and closed the business. The CEO cited a business model that was problematic and not sustainable. The CEO's remarks struck a chord with me, as I used the ZEO product as well as the Fitbit product and others, such as Withings' products. My experience with these products and services, combined with the knowledge built in these blog postings, may point to a business advantage.

I have included the article directly in this posting for quick reference. I also want to direct readers to the Art of Profitability posting for some background on what I am about to discuss. My purpose is not to second-guess ZEO's leadership but instead to explore alternative ways of looking at this problem set. The overarching objective is for the service and product to be profitable.

Figure 1: Withings Aura
28APR14 Update: Withings is coming out with a product line similar to my discussion below. The Aura tracks sleep nearly identically to ZEO and adds integration with other physiology. The product appears more comfortable to use and is cooler looking too, Figure 1. When I first bought Withings' blood pressure product, the only other product was the smart scale. Withings has since expanded the line into a FitBit-like product called Pulse O2 and some products for baby monitoring. Now, if Withings can standardize the interface and integrate with healthcare providers, they would have a much stronger service and market. They need to add products for self-monitoring blood sugar, pH, temperature, etc. Then the line would improve self health monitoring tremendously.

ZEO Folds Under

In the article, the CEO cites commoditization of the service and device as a major contributor to ZEO's challenges, but the real issue was the business model. According to the CEO, there were only two options: 1) a SaaS-like business with recurring revenue streams, or 2) placing all business performance into a single unit sale (Orlin, 2013). The CEO may have been a bit myopic. Business models are nice to have, but they do not make money. Profit models, on the other hand, are how a business makes money.

There are only about two dozen profit models that detail how a business makes money (Slywotzky, 2002). Therefore, ZEO was constrained not by business models but by profit models. Business models differ from profit models in that business models concern the operational design rather than the profit design. Understanding how a company makes money is more important than attempting a business design, a common mistake. For example, the auto industry makes money using the installed base profit model. The automakers attempt to increase the number of vehicles in operation in order to profit from repairs as well as secondary and tertiary aftermarkets. The size of the installed base, the number of vehicles sold or in operation, establishes the value at which the automakers can sell rights to manufacture parts and provide services in the aftermarket. The business model is a central supplier of the installed base to independent and company-owned dealerships, who in turn sell to the installed base and provide services to the market.

Using more than one profit model within a business model is common. Returning to the automaker example, brand profit is also employed: the market has loyalists to the brand. Another profit model is multi-component profit, in which people accessorize the vehicle with fog lamps, leather seats, spoilers, etc. The automakers have figured this out and carefully designed the business model around the profit models. The ZEO product is good but perhaps incorrectly approached. Let me explain as a user of the product and a business-minded person.

ZEO was perhaps too strongly focused on a customer solution profit, solely selling services to monitor people's sleep patterns. In my case, sleep had been one issue among a host of concerns. While generally in good health, I had joined a local fitness club and was self-monitoring a breadth of vitals, having purchased the Withings blood pressure cuff (Image 1), the FitBit activity tracker (Image 2), a WiFi scale (Image 3), and the ZEO sleep monitor (Image 4). I was also using software that tracked my exercise more intensively than FitBit; I found the iTouch app Runtastic served my purposes best (Image 5). Like ZEO, Runtastic provided improved accuracy and detail over the FitBit.

Image 1: Withings BP-800 Image 2: FitBit Image 3: Aria WiFi Scale
Image 4: ZEO Sleep Tracker Image 5:  Runtastic

In addition to the products and focused services offered, I also sought a means to manage all the information collected. Most of the competitors offered a website and premium services, but only their own products were supported for electronic uploads; everything else was time-consuming manual entry. Likewise, the Veterans Administration's myHealtheVet website, Image 6, offered a feature for vital readings in which the products collected some of the data, but again, it was all time-consuming manual upload. Commercial services were available for collating the information but were often costly and again required manual entry.

Image 6:  My HealtheVet Website Application
Looking at this broader picture rather than a niche market, ZEO was not much different from other services in terms of the business model. ZEO offered higher quality and accuracy than other sleep trackers. Speculating, the sleep tracking niche may be a smaller market than the other niche markets, although the CEO indicated it was growing. Each of these niche products, standing alone, may achieve only slim profit margins. The ZEO CEO was of the mindset that the ZEO product and service had commoditized, which is one of the greatest fears of a CEO. However, if viewed from a broader market perspective, these products and services are really profit enhancers to the broader health maintenance market. With this in mind, other profit models begin to become more viable.

One attractive profit model is switchboard profit, which is one-stop shopping for complementary products and services. The switchboard would hover around consolidation of information, feeding blood pressure, activity, weight, and sleep tracking data into a comprehensive view. Supply point control would be achieved using brand saturation of the service, in which websites like MyHealtheVet rebrand the service on their own sites. Microsoft used this profit model, leveraging open systems interfacing with its operating system to become the dominant operating system in the marketplace. As healthcare companies, insurers, hospitals, and clinics readjust under the healthcare reforms, opportunities abound for a health maintenance provider to develop switchboard profit opportunities.

Many of the niche services remain focused on 'what they do well'. ZEO was an excellent sleep tracker. Runtastic is an excellent exercise tracker. Fitbit is good at weight and activity tracking, and Withings is good at blood pressure tracking. None of these are blockbuster profits. However, they do fit into the profit multiplier/enhancer profit model. Any one of these niche players could set up a portal with Application Programming Interfaces for the other, dissimilar niche products that could enhance their own product and sales, as sketched below. For example, sleep problems often occur in combination with other events: hypertension, lack of exercise, weight, and other issues can cause sleep problems, and tracking these collateral details could improve sleep. Rebranding the online services into insurance and healthcare provider websites could also provide additional revenue streams.
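A minimal sketch of what such a switchboard portal could look like, with entirely hypothetical endpoint URLs and feed names, follows:

```python
import json
from urllib.request import urlopen

# Hypothetical vendor feeds; real niche players would each publish their own API.
FEEDS = {
    "sleep": "https://api.example-sleep.com/v1/nightly",
    "activity": "https://api.example-activity.com/v1/daily",
    "blood_pressure": "https://api.example-bp.com/v1/readings",
}

def pull_feed(url):
    """Fetch one vendor feed and parse it as JSON."""
    with urlopen(url) as resp:
        return json.load(resp)

def health_dashboard():
    """Consolidate every feed into one comprehensive view: the switchboard."""
    return {name: pull_feed(url) for name, url in FEEDS.items()}

# health_dashboard() would return {"sleep": {...}, "activity": {...}, ...},
# ready to be rebranded onto a site like MyHealtheVet.
```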

Specialty product profit is possible through branding the devices and services as approved by healthcare experts, or certified by industry experts or insurance companies, for healthcare monitoring. This adds value to the product, increasing the earning potential.

Field force morale profit is another model that can increase revenue streams. ZEO could create a swarm of evangelists who build local community groups and socialize the product. For example, many returning veterans have sleep issues for a variety of reasons. The Veterans Administration might use the product non-medically as a means of tracking sleep regularly, having the data loaded into its Track My Health site. Veterans could meet in groups to discuss their sleep issues, where the product is socialized.

Overall, ZEO had the potential to expand into a greater market but may have exited prematurely. The project manager should facilitate the ideation process for the CEO seeking to build the business case and options. One approach the project manager should use is to pull together a tiger team in order to achieve breakthroughs in thought that disrupt the marketplace and create opportunities for profit.

In conclusion, ZEO should rethink the market position of its service and product. The product is good and has merit among older folks and veterans who have sleep issues. The market can be expanded when the product is thought of as a profit enhancer.



Sleep Tracking Startup Zeo Says Goodnight
By Jon Orlin

One of the early pioneers in the Quantified Self movement has quietly gone out of business. Zeo, a leading maker of hardware and software used by consumers to track sleep and improve their health, has not been operating since the end of last year. A trustee has nearly completed the sale of all company assets. Zeo has been very quiet about the news up until now. In fact, Zeo’s website is still up and doesn’t mention the news.

Zeo was founded by three students at Brown University who had a passion for using the science of sleep and technology to improve people’s lives. The company introduced its first product, the Zeo Personal Sleep Coach, in June 2009.

The following week, the first article mentioning the term “Quantified Self” was published in Wired magazine. While the article didn’t mention Zeo, it did claim “a new culture of personal data was taking shape.” And that every facet of life from sleep to mood to pain was becoming trackable. “Even sleep – a challenge to self-track, obviously, since you’re unconscious – is yielding to the skill of the widget maker.”

In 2011, the widget maker Zeo introduced a mobile version to its Sleep Manager product line. By wearing a special headband, with sensors to measure electrical current, the Zeo could track different phases of sleep, such as Light, Deep and REM sleep, in addition to awake time. This data was then sent to an iPhone, iPod, or Android phone, and could be automatically uploaded to a personal and private online sleep database. This data along with some analytical tools could then be used to help improve your sleep and health.

What Went Wrong

Former CEO, Dave Dickinson, who led the company for the past 5 years, tells TechCrunch the problem was not the brand or the product. In fact, the company was growing before it shut down.

Dickinson says the problem was the business model. “The business model is more important than the brand. Consumer health devices are a very capital-intensive business. You have to find enough money to address the consumer, funds to address the physicians, and also the retailers, and that’s up and above the device business having to fund inventory.”

Zeo had two business model options on the revenue side. Become a SAAS-like business with subscriptions and recurring revenue or make enough money from a customer who bought just one unit. But that was very difficult when the company started pricing its mobile product at $99, with ‘sub-optimal’ profit margins.

The Newton, Massachusetts-based company had raised more than $30 million over eight years. Dickinson says raising capital was not the problem.

Sleep Tracking As A Commodity

Another problem for Zeo was that sleep tracking became a commodity. Devices like the FitBit, Lark, and Jawbone Up use an accelerometer to determine sleep and awake cycles, using wrist actigraphy. These companies brand their products as sleep trackers just like Zeo.

Dickinson says Zeo had peer-reviewed scientific studies, including one published in the Journal of Sleep Research, showing his technology was 7/8th as accurate as data from a sleep lab, considered to be the gold standard for measuring sleep. The study also says data from wrist actigraphy to measure tiny motions in devices are much less accurate. But that didn't seem to matter for enough consumers.

The Competition

Dickinson says he admires what the Fitbit and others like it have done. Those devices are not limited to one health issue like sleep, which was another problem for Zeo. Those other products work for different health and wellness areas, such as the well established desire to lose weight and become physically fit. Consumers already spend billions of dollars to achieve those goals. And they are already educated and motivated to improve their weight and fitness.

Part of Zeo’s business model required it to educate the consumer on the importance of sleep and how sleep awareness and data can improve your health. Arianna Huffington, Editor-in-Chief of the Huffington Post, our AOL sister site, has been a crusader on the importance of sleep to your health. But according to Dickinson, “sleep is still lagging behind as important to your wellness. So in that respect, Zeo was early in terms of its mission.”

The Product

I used the device for several months last year and thought it was amazing. While wearing the headband took some getting used to, for me and my wife, the data it revealed was eye-popping. In addition to learning that I wasn’t getting enough sleep, which I knew already, I learned about the different types of sleep I was getting.

Most nights, I would get a half hour to an hour of “Deep Sleep” (dark green in the chart below) after going to bed. This is the phase of sleep that helps you feel restored and refreshed.

I would also see several periods of REM sleep, important for overall mental health, mood, and the ability to retain knowledge. The bulk of my time asleep, like most people, was spent in “Light Sleep,” which is better than not sleeping but doesn’t do as much for my health as Deep or REM sleep.

I was able to see graphics like this on my iPhone in the morning.

Here’s a good night with a sleep score of 90 out of 100 and more than 8 hours of sleep.


here’s a bad night, with a score of 47 with just 4 and a half hours of total sleep.


If I woke up in the morning during REM sleep, it was hard to get out of bed. If I didn’t get enough Deep Sleep, I didn’t feel I had a good night sleep.

Zeo claimed the real value of the program was I could get personalized online sleep coaching. But this required logging in to the website and entering more information about my sleep and other variables I wanted to track. If I could have entered the data right on my iPhone, I would have likely used it more. Since it required logging in on the website, it proved too much friction for me.

I also stopped wearing the headband after a while because it does feel a bit awkward. The former CEO says the company was aware the device was too invasive for some customers.

But if a less invasive sensor was made and it was easier to enter custom data and get actionable information, I would have used it every night.


What’s Next

Dickinson can’t comment on exactly what’s next for Zeo, after all the assets are sold. But he is hopeful that there may be an opportunity for the company to re-emerge in the future.

An article appeared in MobiHealthNews in March that reported the Better Business Bureau had listed Zeo as being “out of business,” but with no official announcement by the company, the news hasn't been widely known.

It is still possible to log-in to Zeo’s “My Sleep” site that contains your sleep data. An article on the Quantified Self website today tells users how they can download their data in case the site goes offline.

As word about Zeo’s status has spread, Dickinson says they have received tremendous support and inquires from all over the world from disappointed customers and sleep researchers who had planned to use the units for the research.

He wrote a post on the MobiHealthNews site last week that included some additional lessons learned. He concluded by writing “motivating behavioral change through data visualization can be very powerful, but it is more of an art than a science. We will need far more artists, user interface experts and psychologists to help make our data work harder to motivate better health.”

References:

Orlin, J. (2013). Sleep tracking startup Zeo says goodnight. TechCrunch. Retrieved December 23, 2013 from
http://techcrunch.com/2013/05/22/sleep-tracking-startup-zeo-says-goodnight/ 

Slywotzky, A. (2002). The art of profitability. New York: Warner Business Books.

Monday, December 23, 2013

Technical Brief Series

Comment: Many of the postings in this blog are of a technical nature but do not go too deep. I have compiled these postings into this series for quick reference purposes. Most of these technical briefs were part of an overarching effort to quickly train staff on specific subject matter. I hope that you find them useful.


Short Read Archives

Several years ago, while in a leadership role running an operationalized telecommunications cell, I was challenged with a variety of knowledge levels within the staff as well as high staff turnover. Every quarter I was transitioning about 25% of the staff and observing 100% turnover annually. The reason was that the cell was an interim duty in which staff got a check-in-the-box before moving on to their primary duty. Also, up to 35% of the staff were augmentees assigned during crisis events. I needed a method to bring the staff up to speed and continually increase knowledge. After looking around, I drew upon operations management practices and my own educational experiences, looking to McGraw-Hill's SRA program. SRA is the acronym for Science Research Associates, Inc., but has also become known as Short Read Archive.

The SRA program was simple. SRA card series were created for self-paced learning within a framework of benchmarks or milestones. The front side of each card was a short read, and the back side was exercises and a quiz. Forms were completed for the quiz and turned in to the instructor for grading. Some SRA card sets were self-graded, and benchmark testing was graded by the instructor. I chose to employ the short read approach and coupled it to weekly scheduled training. Each staff member had a standard training plan and would draw the short briefs based on the schedule set for them. The goal was to cross-train everyone under a training management program.

The original short reads were drafted by the staff, then reviewed, smoothed, and published to the set. The idea was not to teach deep detail but to create increasing familiarity with the systems and technologies in use. In all, there were about 50 short reads, with a cyclic monthly review of 4 of them.

In preparing these for posting, I updated a few and consolidated some cards. Most likely I will not post all 50 short reads, just the more interesting ones. Most discuss the technology at a high level.
I will continue to add to this list over time. 

Saturday, December 21, 2013

Neural Agents

Comment: Several years ago, I was the leader of an operationalized telecommunications cell. The purpose of the cell was to monitor the effectiveness and readiness of the telecommunications in support of ongoing operations. The staff regularly turned over due to the operational tempo, and I had to train new staff quickly. I did so by preparing a series of technical briefs on topics the cell dealt with. This brief deals with neural agents; I have updated it and provided additional postings that paint a picture of potential advanced systems.

Human Centric Computing: This discusses the way humans interface with computational machines.

Knowledge Management Brief: This brief discusses KM and its importance in an organization.

Organizational Computational Architecture: This takes a unique look at computational power in an organization.

Chaos Strategy Part I A: This post looks at latency and how organizations have got to get better at problem solving.

Neural Agents

Figure 1: Agent Smith
The Matrix movie franchise
Neural agents have a spooky air about them, as though they are sentient and have clandestine purpose. The movie franchise 'The Matrix', Figure 1, made use of Agent Smith as an artificial intelligence designed to eliminate zombie processes in the simulation, along with human simulations that became rogue, such as Neo and Morpheus. In the end, Agent Smith is given freedom that results in him becoming rogue and rebellious, attempting to acquire increasing power over the simulation.

The notion of artificial intelligence has been around forever. Hollywood began capturing the idea in epic battles between man and machine in the early days of sci-fi. More recently, the movie "AI" highlighted a future where intelligent machines survive humans. Meanwhile, the Star Trek franchise advances intelligent ships using biological processing and has a race of humanoid machines called the Borg. Given all the variations of neural technologies, the neural agent remains a promising technology emerging in the area of event monitoring, though not acting quite as provocatively as Agent Smith. The latest development is neural agents in support of artificial intelligence: neural agents, or Neugents (no relation to Ted Nugent), are becoming popular in some enterprise networks.

Companies can optimize their business and improve their analytical support capabilities, as this technology enables a new generation of business applications that can not only analyze conditions in business markets but also predict future conditions and suggest courses of action to take.

Inside the Neugent

Neural agents are small networked units, or agents, containing hardware and software. Each agent has processors and a small amount of local memory. Communications channels (connections) between the units carry data, usually encoded on independent low-bandwidth telemetry. The units operate solely on their local data and on input received over the connections to other agents. They transmit their processed information over telemetry to central monitoring software or to other agents.

The idea for Neugents came from the desire to produce artificial systems capable of “intelligent” computations similar to those of the human brain. Like the human brain, Neugents “learn” by example or observation, much as a child learns to recognize colors from examples of colors. By going through this self-learning process, Neugents can acquire more knowledge than any expert in a field is capable of achieving.

Neugents improve the effectiveness of managing large environments by detecting complex or unseen patterns in data. They analyze the system for availability and performance. By doing this, Neugents can accurately “predict” the likelihood of a problem and even develop confidence over time that it will happen. Once a Neugent has “learned” the system's history, it can make predictions based on the analysis and generate an alert such as: “There is a 90% chance the system will experience a paging file error in the next 30 minutes”.
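A toy sketch of this learn-then-predict loop, assuming scikit-learn is available and using made-up metric names and data, might look like this:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Historical samples of [cpu_load, paging_rate, queue_depth], each labeled 1
# if a paging-file error followed within 30 minutes, else 0 (fabricated data).
X = np.array([[0.20, 100, 1], [0.30, 150, 2], [0.80, 900, 7],
              [0.90, 1100, 9], [0.40, 200, 2], [0.85, 950, 8]])
y = np.array([0, 0, 1, 1, 0, 1])

model = LogisticRegression().fit(X, y)    # the "learning by observation" step

current = np.array([[0.88, 1000, 8]])     # the system state right now
p = model.predict_proba(current)[0, 1]    # probability of the failure class
print(f"There is a {p:.0%} chance of a paging file error in the next 30 minutes")
```

A production agent would train on far richer telemetry, but the shape of the loop is the same: observe, fit, and score the present against the learned past.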

How Neugents Differ From Older Agents

Conventional, older agent technology requires someone to work out a step-by-step solution to a problem and then code the solution. Neugents, on the other hand, are designed to understand and see patterns, to train. The logic behind the Neugent is not discrete but instead symbolic. They assume responsibility for learning, then adapt or program themselves to the situation and even self-organize. This process of adaptive learning increases the Neugent's knowledge, enabling it to more accurately predict future system problems and even suggest changes. While these claims sound far-reaching, progress has been made in many areas, improving adaptive systems.


Neugents get more powerful as you use them. The more data they collect, the more they learn; the more they learn, the more accurate their predictions. This capability comes from two complementary technologies: the ability to perform multi-dimensional pattern recognition based on performance data and the power to monitor the IT environment from an end-to-end business perspective.

Systems Use of Neugents and Benefits

Genuine enterprise management is built on a foundation of sophisticated monitoring, and Neugents apply to all areas. They can automatically generate lists for new services and products, determine unusual risks and fraudulent practices, and predict future demand for products, enabling businesses to produce the right amount of inventory at the right time. Neugents help reduce the complexity of the Information Technology (IT) infrastructure and applications by providing predictive capabilities and capacities.

Neugents have already made an impact on the operations of many Windows Server users who have tested the technology. They can take two weeks of data and, in a few minutes, train the neural network. Neugents can then detect if something is wrong. They have become a ground-breaking solution that will empower IT to deliver the service today's digital enterprises require.

With business applications becoming more complex and mission-critical, the use of Neugents becomes more necessary in order to predict, and then address, performance and availability problems before downtime occurs. By providing true problem prevention, Neugents offer the ability to avoid the significant costs associated with downtime and poor performance. Neugents encapsulate performance data and compare it to previously observed profiles. Using parallel pattern matching and data modeling algorithms, the profiles are compared to identify deviations and calculate the probability of a system problem.
Conclusion

Early prediction and detection of critical system states give administrators an invaluable tool to manage even the most complex systems. By predicting system failures before they happen, organizations can ensure optimal availability. Early predictions can help increase revenue-generating activities as well as minimize the costs associated with system downtime. Neugents alleviate the need to manually write policies to monitor these devices.

Neugents provide the best price/performance for managing large and complex systems. Organizations have discovered that defining an endless variety of event types can be exhausting, expensive and difficult to fix. By providing predictive management, Neugents help achieve application service levels by anticipating problems and avoiding unmanageable alarm traffic as well as onerous policy administration.

Human Centric Computing

Commentary: This post was originally written in Dec 2010, then updated and reposted in Dec 2013. Chief Executive Officers fear commoditization of their products and services, a major indicator of attenuating profit margins and a mature market. Efforts such as "new and improved" are short-term efforts to extend a product life cycle. A better solution is to resource disruptive technologies that cause obsolescence of standing products and services, shift the market, and create opportunities for profit. Human centric computing does just that, and project managers may find they are involved in implementing these projects at various levels of complexity. Thus, project managers should have a grasp of this technology and even seek solutions in their current projects.

Neural Agents: This posting discusses the use of agents to solve problems.

Knowledge Management Brief: This brief discusses KM and its importance in an organization.

Organizational Computational Architecture: This takes a unique look at computational power in an organization.

Chaos Strategy Part I A: This post looks at latency and how organizations have got to get better at problem solving.

Human Centric Computing

Human centric computing has been around for a long time. Movies have for decades fantasized and romanticized about sentient computers and machines that interface with human beings naturally. More recent movies have taken this to the ultimate end with characters such as Star Trek's Data, Artificial Intelligence: A.I.'s character David, or I, Robot's character Sonny. In all cases, these machines developed self-awareness, or the essence of what is considered to be uniquely human, but remained machines. The movie Bicentennial Man went the opposite direction, with a self-aware machine that became human. This is fantasy, but there is a practical side to it.

Michael Dertouzos, in his book 'The Unfinished Revolution', discusses early attempts at developing the technologies behind these machines. The current computational technologies are being challenged as the unfinished revolution plays out. I am not in full agreement with the common understanding of the Graphical User Interface, GUI, as "a mouse-driven, icon-based interface that replaced the command line interface". This is a technology-specific definition that is somewhat limiting and arcane in its thinking. A GUI is more akin to a visceral human centric interface, of which one form utilizes a mouse and icons. Other forms use touch screens, voice recognition, and holography. Ultimately, in the end state, the machine interfaces with humans as another human would.

Human Centric Computing

Humans possess sensory capabilities that are fully information based. Under the auspices of information theory, human sensory processing was shown during the 1960s to be consistent with Fourier transforms. These are mathematical formulas in which information is represented by signals in terms of time domain frequencies. In lay terms, your senses pick up natural events and biologically convert the events to electrical signals in your nervous system. Those signals have information encoded in them and arrive at the brain, where they are processed holographically. The current computational experience touches three of the five senses. The visceral capability currently provides the greatest information to the user because the primary interface is visual and actually part of the brain. The palpable and auditory are the lesser utilized, with touch screens, tones, and command recognition. The only reason smell or taste is used is if the machine is fried in a surge, which leaves a bad taste in one's mouth. However, all the senses can be utilized, since their biological processing is identical. The only need is for the correct collection devices or sensors.

Technological Innovations Emerging

If innovations such as the device examples below are fully developed and combined with the visceral and palpable capabilities cited earlier, truly human centric machines will have emerged and the 'face' of the GUI will have changed forever.

Microsoft's new Surface is literally a desktop that changes the fundamental way humans interact with digital systems by discarding the mouse and keyboard altogether. Bill Gates remarks that the old adage was to place a computer on every desktop; now, Gates remarks, Microsoft is replacing the desktop completely with a computational device. This product increases the utilization of the palpable combined with the visceral in order to sort and organize content, then transfer the content between systems with the relative ease of using the fingertips (Microsoft, 2008). For example, a digital camera is set on the surface, the stored images are downloaded, and they then appear as arrayed images on the surface for sorting with your fingertips. The TED Talk 'Multi-touch Interface' highlights the technology.

Another visceral and palpable product is the Helio Display. This device eliminates the keyboard and mouse as well. Images appear in three dimensions on a planar field cast into the air using a proprietary technology. Some models permit the use of one's hands and fingers in order to 'grab' holographic objects in mid-air and move them around (IO2, 2007). Another example of this concept is the TED Talk video 'Grab a Pixel'.

On touch screens of various forms, virtual keyboards can be brought up if needed. However, speech software allows not only speech-to-text translation but also control and instructions. Speech engines can provide high quality spoken instructions, replacing error tones and help text. Loquendo's telephony products, for example, are capable of interacting with callers, and the software comes in 25 languages (Loquendo, 2008).

There are innumerable human centric projects ongoing. In time, these products will increasingly make it to market in various forms, where they will be further refined and combined with other emerging technologies. One such emerging trend and field is the blending of virtual reality and the natural world. The TED Talk video 'SixthSense' illustrates some of the ongoing projects and efforts to change how we interconnect with systems.

Combining sensory and collection technology with neural agents may increase the ability to evaluate information, bringing computer systems closer to self-awareness and true artificial intelligence. Imagine a machine capable of taking in an experience and then sharing that experience in a human manner.

Commentary: Project managers seeking to improve objectives, where selection and collection of information can be gathered quickly without typing or swiping a mouse across the screen, should consider using these types of products whenever possible. Although costly now, the cost of these technologies will drop as the new economy sets in.

References:

Dertouzos, M.L. (2001). The unfinished revolution: Human-centered computers and what they can do for us (1st ed.). HarperBusiness.

Englander, I. (2003). The Architecture of Computer Hardware and Systems Software: An information Technology Approach. (3rd ed.). New York: John Wiley & Sons Inc.

IO2 Staff Writer (2007). HelioDisplay. Retrieved 25FEB09 from http://www.io2technology.com/

Microsoft Staff Writer (2008). Microsoft Surface. Retrieved 25FEB09 from http://www.microsoft.com/surface/product.html#section=The%20Product

Loquendo Staff Writers (2008). Loquendo corporate site. Retrieved 25FEB09 from http://www.loquendo.com/en/index.htm

Voice Over Internet Protocol (VOIP) Technical Brief

Comment: Several years ago, I was the leader of an operationalized telecommunications cell. The purpose of the cell was to monitor the effectiveness and readiness of the telecommunications in support of ongoing operations. The staff regularly turned over due to the operational tempo, and I had to train new staff quickly. I did so by preparing a series of technical briefs on topics the cell dealt with. This brief discusses VOIP and its potential vulnerabilities.

Voice Over Internet Protocol (VOIP)

VOIP is the set of standards that defines the management of voice signals sent over the Internet. The principal difference between VOIP and traditional phone systems is that VOIP transmits voice using discrete digital packets instead of analog signals.

There are many technical challenges to making VoIP work effectively, centered on bandwidth, the timing of the voice packets (jitter), the order of packet arrival, and extra packets resulting from network flooding. These challenges can reduce the effectiveness of the technology over longer distances but are minimized over shorter distances. Hence, VOIP thrives on classic LANs and tends to remain localized. In order to achieve broad area communications, VOIP technology makes use of the Public Switched Telephone Network, PSTN. A VOIP server acts as a gateway between the LAN and the PSTN, Figure 1. There are pure VOIP networks that are completely TCP/IP, but the long distance transmissions are trunked into a TCP/IP carrier such as a T1 or some other point-to-point TCP/IP based trunk.

Figure 1: VOIP System use of PSTN
Source: Wallingford, T. (2005). Switching to VoIP (1st ed.). USA: O'Reilly Media, p. 29.
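As a concrete illustration of the timing and ordering challenges above, receivers typically use a jitter buffer that reorders packets by sequence number before playback. The following is a minimal sketch of the idea, not any particular vendor's implementation:

```python
import heapq

class JitterBuffer:
    """Reorder RTP-style packets by sequence number and release them steadily.
    Holds up to `depth` packets to absorb network timing variation; a real
    implementation would also drain on a playback timer."""
    def __init__(self, depth=3):
        self.depth = depth
        self.heap = []        # min-heap keyed on sequence number
        self.next_seq = None  # next sequence number owed to the decoder

    def push(self, seq, payload):
        heapq.heappush(self.heap, (seq, payload))
        if self.next_seq is None:
            self.next_seq = seq

    def pop(self):
        """Release the next in-order packet; None while still buffering."""
        if len(self.heap) < self.depth:
            return None
        seq, payload = heapq.heappop(self.heap)
        if seq > self.next_seq:   # a packet was lost: skip past the gap
            self.next_seq = seq
        self.next_seq = seq + 1
        return payload

buf = JitterBuffer(depth=3)
for seq, chunk in [(2, "b"), (1, "a"), (4, "d"), (3, "c"), (5, "e")]:
    buf.push(seq, chunk)
    out = buf.pop()
    if out:
        print(out)  # prints a, b, c in order despite out-of-order arrival
```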
Vulnerabilities: Many of the same threats that affect computer networks also affect VoIP. Denial-of-Service (DoS) attacks prevent access to computer network resources, including VoIP, typically by overwhelming network services and choking transmissions. Some DoS attacks include:
  • Application DoS Attack: The goal of this type of attack is to prevent users from accessing a network service by forcing the service to fulfill overwhelming transactions, sometimes loosely called spamming; for example, flooding a web server with service requests.
  • Network DoS Attack: An attack that sends large amounts of data overwhelming the victim network infrastructure. A ping flood attack is one example.
  • Transport DoS Attack: This attack targets the operating system by sending an excessive number of connection requests, causing the system to lock up.
  • Man-in-the-Middle Attacks require access to the victim network, either by 'tapping' a physical path on the network or by receiving radio frequency traffic. Some of these attacks include:
  • Manipulation: The ability to collect, modify, and then re-transmit modified data.
  • Eavesdropping: Illicit unauthorized receipt of a data communication stream for the purpose of analyzing and monitoring.
  • ARP Poisoning: The ability to force network traffic through a malicious machine by associating the hostile machine's MAC address with the legitimate machine's IP address, thus impersonating the victim.
  • Packet Spoofing: Impersonation of a legitimate user. This is often automated, and user level access is not available.
  • Replay: The retransmission of a genuine message so that the device receiving the message can reprocess it.
VoIP, in general, is vulnerable to two categories of threats: internal and external. External threats, such as DoS and man-in-the-middle attacks, come from a third party outside the VoIP conversation. VoIP conversations are most susceptible to these external threats when the packets traverse the Internet or untrustworthy networks and devices. Internal threats are more complicated and originate from VoIP conversation participants. They violate a trust relationship from behind a firewall and expose the system to a number of threats. Some examples of internal threats are listed below:
  • Trivial File Transfer Protocol (TFTP) eavesdropping is a risk of VoIP. Normally, TFTP is used to transmit system maintenance files unencrypted. Exploitation of this feature exposes the system to delivery of firmware that opens vulnerabilities used to enumerate the computer network.
  • Some systems use dynamically assigned IP addresses. A vulnerability of impersonating a legitimate user, known as IP spoofing, could exist. Also, the server that assigns dynamic IPs, the Dynamic Host Configuration Protocol (DHCP) server, may be exposed to common network attacks.
  • VoIP conversations are inserted into a Real-time Transport Protocol (RTP) media stream to manage the conversation and overcome some of the VoIP technical challenges. This opens an opportunity for exploitation, since the conversation is unencrypted unless a virtual private network (VPN) protects it.
  • Telnet could allow access to the system if not disabled on the end user machines.
Overall, VOIP offers both benefits and vulnerabilities. In comparison to the traditional phone system, which operates under Signaling System Seven, SS7, on the Public Switched Telephone Network, PSTN, VOIP is lower cost but in most cases does not provide the long distance reliability of the PSTN, unless dedicated, and costly, point-to-point TCP/IP based trunks are provided.

References

Wallingford, T. (2005). Switching to VoIP (1st ed.). USA: O'Reilly Media, p. 29.