Saturday, November 27, 2010

Business Intelligence


Commentary: Many vendors offer business intelligence solutions. Project managers need to understand not only the vendor application but, more importantly, the business's strategic objectives.

The article “Best Practices for Great BI Performance with IBM DB2 for i” discusses another vendor solution to the general information science problem of data storage. Practices such as Business Intelligence (BI) relate more to the contextual presentation of data patterns; how data is stored, retrieved, and then processed on demand is essential to good BI. BI is a subset of the broader discipline of Decision Support Systems (DSS), which sit at the top of the information food chain. Information sub-systems collect, sort, validate, and store data before passing the information to the next level. As the data moves up through the levels, the quality of the data improves.

Acquiring quality data for DSS systems and applications begins at a much lower level. Operational sub-systems collect data, and business rules validate and then store it, usually in several different relational databases. Typical operational sub-system data involves data entry of customer data, employee time clock data, repair data, bookkeeping, and logistical information collected by scanners. Information across these databases is then gathered and organized in support of operational-level processes, which add value to the data. Typical operational processes involve activities such as purchase orders, payroll, travel, customer service, and financial statements. Operational-level data across numerous systems and databases is then rolled up into a DSS. DSS-level processes are dramatically different from operational processes: they look at the character of the data sets, for example trends, patterns, and behaviors, in order to form strategic decisions. Because of the large data sets involved in DSS, the storage, processing, and reporting of data are critical to meeting on-demand review requirements in a timely manner.

The common approach currently in use is the data mart, a working subset of a larger primary database that presents a unique view. When data marts are organized in ways that permit multi-dimensional modeling of the operations, they are called data cubes. The use of methodologies such as online transaction processing (OLTP) and online analytic processing (OLAP) can continuously usher data into the data cubes in support of on-demand reviews. Numerous vendors are beginning to offer services in this arena, although the methodologies and markets are still being shaped. “BI will surge in the mid market” (Cuzzillo, 2008). “In the last two years or so we have seen some important new technologies emerge and begin to influence BI, and I believe they’ll have an even more significant effect in the coming year. Some examples include SOA/Web services (and overall componentized design), in-memory analytics, integrated search, and the use of rich media services to provide more compelling (Web-based) user experiences.” (Briggs, 2008)
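To make the data cube idea concrete, here is a minimal sketch using Python's pandas library over hypothetical sales data. A cube is simply a measure aggregated across combinations of dimensions, so an on-demand review can slice it along any axis.

    import pandas as pd

    # Operational-level records rolled up from transactional systems.
    sales = pd.DataFrame({
        "region":  ["East", "East", "West", "West", "East", "West"],
        "product": ["A", "B", "A", "B", "A", "B"],
        "quarter": ["Q1", "Q1", "Q1", "Q2", "Q2", "Q2"],
        "revenue": [100.0, 150.0, 120.0, 90.0, 130.0, 110.0],
    })

    # A "cube": the measure aggregated across the dimension combinations.
    cube = sales.pivot_table(
        index="region",      # dimension 1
        columns="product",   # dimension 2
        values="revenue",    # measure
        aggfunc="sum",
        margins=True,        # grand totals, a hallmark of OLAP roll-ups
    )
    print(cube)

    # Slicing for an on-demand review: the revenue trend for one region.
    print(sales[sales.region == "East"].groupby("quarter")["revenue"].sum())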

Commentary: Decision support systems are a growing business interest as markets become increasingly volatile. Possessing a fundamental understanding of these systems and their value to business is central to architecting effective systems. Many businesses continue to struggle with the best way to employ information technologies and serve business intelligence needs. Project managers implementing these kinds of projects need to be involved right from the inception in order to maintain a focus on the strategic objectives. It is the project manager who corrals and focuses these projects for the senior leaders, achieving their visions.

References:

Englander, I. (2003). The Architecture of Computer Hardware and Systems Software: An Information Technology Approach (3rd ed.). New York: John Wiley & Sons Inc.

Cain, M. (2008). Best Practices for Great BI Performance with IBM DB2 for i. Penton Media, Inc. Retrieved from http://www-03.ibm.com/systems/resources/Great_BIPerformance.pdf

Cuzzillo, T. (2008, December). Analysis: BI transformation in 2009. TDWI. Retrieved from http://www.tdwi.org/News/display.aspx?ID=9262

Briggs, L. (2008, December). Q&A: Market forces that will mold BI in 2009. TDWI. Retrieved from http://www.tdwi.org/News/display.aspx?ID=9263

Healthcare Information Virtual Environment System (HIVES)

Healthcare information systems encompass a vast array of equipment, clinics, labs, governmental agencies, manufacturers, doctors' offices, and innumerable other organizations providing, collecting, and processing information. Classic issues of stovepiping, or 'silos,' have emerged, causing inefficiencies in the industry such as multiple lab tests and/or diagnostics being prescribed. The advent of a nationalized health records system increases the complexity of these networks as well. In order to gain management and control over these information systems, the American National Standards Institute (ANSI) hosts the Healthcare Information Technology Standards Panel (HITSP). This is one of several cooperative efforts between industry and government to create standards. However, all too often the standards result in a highly complex architecture and system design. This is because early standards and architectures often focus on resolving major issues with little forethought about the broader architecture. Many argue that little information is known or that the project is far too complex. Years later, this results in an effort to simplify and streamline the system again.

Allowing a Frankenstein architecture to emerge would be a travesty when our initial objectives are to streamline healthcare processes, removing redundancies and latencies in the current system. The planners should design the system for streamlined performance early. Large-scale projects like these are not new, and history offers useful lessons. Complex systems such as the computer, the car, and the Internet have evolved out of a democratization of design. Literally tens of thousands of people have contributed to these systems, and those models are one approach to resolving the large-scale, complex information systems involved in healthcare. What has emerged out of the democratization of design is a standardization of interfaces in a virtualized environment. For example, headlamps are nearly identical for every car, with standard connectors and mounts, even though the headlight assemblies are artfully different on each car. The computer has standard hardware and software interfaces even though the cards and software perform different functions. The virtual computer is independent of vendor product specifications. Instead, the vendor performs to a virtual computer standard in order for their products and services to function properly.

Let us take a moment to explain that virtualization is the creation of a concept, thing, or object as an intangible structure for the purpose of study, order, and/or management. The practice is used across a breadth of disciplines, including particle physics and information science. Within the information realm, there are several different virtualization domains, including software, hardware, training, and management virtualization. My interest is not in the use of any specific virtualized technology but in exploring healthcare virtualization management.

I propose a Healthcare Information Virtual Environment System (HIVES), Figure 1, as essential to reducing complexity and establishing a standard for all those participating in the healthcare industry. The virtual environment is not a technological system. Instead, it is a management system, a space in which medical information is exchanged by participating objects within the virtual environment. Real things like clinics, offices, data centers, and equipment sit on the virtualized backplane or space. HIVES would have a set of standards for participating equipment, clinics, hospitals, insurance agencies, data centers, etc., connecting to the environment in order to exchange information. Many may remark that these standards exist. I can locate dozens of vendor products and services supporting hardware, software, and even service virtualization, but none provides standard virtualized management of the overarching healthcare environment, which is what the nationalized healthcare system is attempting to manage. I have reviewed HITSP and noted there is no clear delineation of a virtualized managed environment.

Figure 1: HIVES


In such an environment, I envision that data placed into the environment would have addressing and security headers attached. In this way, data is limited to those listening who have authorization to gather, store, and review specific information. For example, a doctor prescribes a diagnostic test. An announcement of the doctor's request is made in the environment, addressed to testing centers. Scheduling software at a testing facility participating in the environment picks up the request and schedules the appointment. It announces the appointment in the virtualized environment, where the doctor's office software is listening to receive the appointment data. Once the patient arrives, the machines perform the diagnostics, placing the patient's data back in the environment. An analyst picks up the record, reviews it, and posts the assessment in the environment. In the meantime, a data center participating in the environment that holds the patient's record is listening; it collects all new information posted in the environment regarding the patient and then serves those records to authenticated requests. The patient returns to the doctor's office, which requests the patient's record from the data center through the environment.
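To illustrate the concept, here is a minimal Python sketch of such an exchange: messages carry addressing and security headers, and participants receive only traffic addressed to them that they are authorized to read. Every name and field below is a hypothetical illustration, not an existing standard.

    from dataclasses import dataclass

    @dataclass
    class Message:
        addressed_to: str      # class of participant, e.g. "testing-centers"
        patient_id: str        # whose record this traffic concerns
        authorized_roles: set  # roles permitted to read the message
        payload: dict

    class Environment:
        """The virtual space: participants subscribe by address and role."""
        def __init__(self):
            self.listeners = []  # (address, role, callback) tuples

        def listen(self, address, role, callback):
            self.listeners.append((address, role, callback))

        def announce(self, msg: Message):
            for address, role, callback in self.listeners:
                # Deliver only to matching addressees with an authorized role.
                if address == msg.addressed_to and role in msg.authorized_roles:
                    callback(msg)

    env = Environment()

    # A testing facility listens for diagnostic requests it may schedule.
    env.listen("testing-centers", "scheduler",
               lambda m: print("Scheduling", m.payload["test"], "for", m.patient_id))

    # A data center holding the patient's record archives authorized traffic.
    env.listen("testing-centers", "records",
               lambda m: print("Archiving order for", m.patient_id))

    # The doctor's office announces the prescribed diagnostic test.
    env.announce(Message(
        addressed_to="testing-centers",
        patient_id="patient-042",
        authorized_roles={"scheduler", "records"},
        payload={"test": "blood panel"},
    ))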

The advantages of having such an environment, whether called HIVES or something else, are enormous. The patient's records are available to all participating in the environment; security levels and access can be administered in the environment efficiently to ensure HIPAA and other security compliance standards; bio-surveillance data is more readily available, with higher accuracy, in the data centers; the environment can be an industry-driven standard managed through a consortium; and the government could be an equal participant in the environment.

Moreover, to be a participant, a manufacturer, clinic, lab, hospital, doctor's office, data center, or any other organization has to meet the clearly defined standards and become a consortium participant at some level. Thus, the complexity of the architecture and systems interfacing can be tremendously reduced, achieving the stated objectives of healthcare reform and streamlining.

Commentary: Please feel free to comment and dialogue on this concept. I would especially enjoy commentary regarding the standards and any efforts at virtualized management of healthcare information.

Sunday, November 21, 2010

Impacts of Complexity on Project Success

Commentary: These are the relevant portions of an extensive paper written for my Master's in Information Technology coursework. The paper highlights a common concern among many project managers: the lack of quality information early in a project, especially in complex projects. The overall paper proposed research into project complexity and early planning efforts.

Introduction

Project management practice and principles have been maturing and continue to mature. The general paradigm of planning well applies to early project planning and has a significant influence on the success or failure of a project. This research supports identifying the key relationships between influential factors affecting scope and risk in complex projects during early project planning. Attention to complexity is important since information technology (IT) projects are complex by nature. Complexity tends to increase risk. "Project abandonment will continue to occur -- the risks of technology implementation and the imperfect nature of our IT development practices make it inevitable" (Iacovou and Dexter, 2005, p. 84). Therefore, this study focuses on early information technology project planning practices, when the project is vague and the outcomes are unknown and unforeseen. The purpose is to better manage scope gap early.

Problem Statement. Poor scope formulation and risk identification in complex projects during early planning have led to lower project performance and weakened viability. Therefore, project managers are challenged to manage these issues early in order to increase the project's viability and success.

Argument. Project complexity influences performance, just as taking shortcuts in a rush for results causes an outcome with complexity-like characteristics. Lower performance outcomes may result from essential project factors relating to scope and risk objectives being overlooked or not properly managed, resulting in increased cost, delays, and/or quality issues that jeopardize the project's viability and success.

Body of Works Review

This effort intends to explore the significant body of work that has emerged to date. Literature research was conducted across a diversity of project types in support of the research problem statement: that poor scope formulation and risk identification in a complex project during early planning affect project performance and project viability in relationship to the complexity of the project. This is by no means the first time research of this nature has been explored in these three areas: scope definition, risk identification, and project complexity.

The common threads in the body of work span decades and include project management as a whole, risk and scope factors that affect project success, information and communications challenges, and complexity impacts on scope and risk. Works researched in other disciplines provide many transferable lessons learned. For example, construction and engineering projects share with information technology projects both complexity issues and information reporting and sharing concerns. Other works from supporting disciplines contribute factors on education, intellect, and learning in support of competency influences on risk. A 2001 trade publication article indicated that the causes of failed projects tend to be universal. The article's author, John Murray, concludes that information technology projects fail for a small set of problems rather than exotic causes (Murray, 2001, pp. 26-29).

In a 2008 construction project study, the researchers discussed the construction industry's front-end planning, which they explain is the same as the project charter process. The work details a study of fourteen companies and their project planning processes, then presents a model process. The study results are summarized into critical criteria for success. In conclusion, fifty percent of the projects did not have the information required for front-end planning activities. Problem areas identified in a follow-on study included weak scope and risk identification as well as other basic issues (Bell and Back, 2008).

The problems of scope definition researched in the body of work indicate that cooperative planning and information sharing are key factors in developing scope. A 2007 study on concurrent design addressed the complexities and risks of concurrent design projects. The researchers posed a model of interdependent project variables whose linkages illustrate the direction of communications, or information sharing, between the variables. In their analysis, the researchers conclude that cooperative planning, in the form of coupling and cross-functional involvement, significantly reduces rework risk. Early uncertainty resolution depends on cross-functional participation (Mitchell and Nault, 2007).

The Technology Analysis and Strategic Management journal published an article in 2003 discussing outsourcing as a means of risk mitigation. The outcome of the case under review was project failure due to a lack of clear requirements and poor project management, attributed to conflict and a loss of mutual trust between the outsourced vendor and the information technology client. The result was the vendor cutting losses due to weak commitment when compared to in-house project support. The researcher suggested that shared risk may be effective in a partnership such as outsourcing but requires strong communication and some level of ownership (Natovich, 2003, p. 416). This article's case study illustrates that cooperation is critical in information technology projects. A 1997 study discussed mobilizing the partnering process in complex multinational engineering and construction projects. The researchers argued that developing project charters fostered stronger partnerships and reduced risk. In general, the article promotes a shared purpose supported by a method based on vision, key thrusts, actions, and communication. The work offers management practices and predictors for conflict resolution and successful projects. Among the best predictors of success in high-performance project managers are the abilities to reconcile views rather than differentiate, to influence through knowledge, and to consider views over logic or preferences (Brooke and Litwin, 1997).

The literature has also indicated that the competencies of project members and conflict resolution are key factors of interest. A Northeastern University study explored strengthening information technology project competencies, surveying 190 employers and finding that employers considered hands-on experience, communications ability, and the behavioral mannerisms of the student, among other attributes. The researcher calls for a mixture of improvements to student curricula involving project management skills, both visionary and hands-on, as well as group interaction (Kesner, 2008). Efforts to strengthen competencies have come not only from traditional educational institutions but also from professional organizations such as the American Institute of Certified Public Accountants (AICPA). A 2008 article discussed the accounting industry's approach to identifying and correctly placing information technology staff based on assessed competency levels. The AICPA is using a competency set that spans industries and skill levels ("IT Competency", 2008). Some dated literature also indicates that solving vague problem sets within complex projects centers on a willingness and ability to engage vague circumstances, to think abstractly. A 1999 psychology publication discussed typical intellectual engagement as involving a desire to engage and understand the world, interest in a wide variety of things, a preference for complete understanding of a complex problem, and a general need to know. The study associated intellect with a tendency to engage one's environment in an intellectual manner, to problem solve, and to believe one possesses a greater locus of control over the events in one's life (Ferguson, 1999, pp. 557-558). Additional research is necessary in this area given how dated this work is.

In a 2006 article, researchers sought to understand methodologies for reporting to senior managers regarding software development projects. The work discusses reporting and governance in an organization, breaks reporting into four functional areas, and further refines best practices into a common view. The researchers noted that little attention has been given to how senior managers and the board can be informed about project progress, and they offered several methods of informing them. They reported that senior managers need information grouped into three classes: decision support, project management, and benefits realization assessments. The researchers then discuss a variety of reports and their attributes, concluding that senior managers and board members need effective reporting if they are to offer oversight to a software development project (Oliver and Walker, 2006). The same 2006 study indicated that continuous reporting, or information sharing, builds the case for compelling board member involvement based on four factors: cost overrun history, material expenditures, [software] complexity, and any adverse effects on the company (Oliver and Walker, 2006, p. 58).

Efforts to manage project complexity have made information technology governance a key factor in project success. Information technology governance has been sought as a framework to align organizational goals with project goals. In a 2009 qualitative study, researchers treated information technology governance, change management, and project management as closely related, then stated the premise that information technology governance must itself be governed to ensure that problems due to weak governance are corrected. They pose the question of how much information technology governance is required. They then organize information technology governance into three broad groups, corporate governance, scope economies, and absorptive capacity, and explore these groupings. The researchers finally relate information technology governance to the enterprise at all levels, discussing the results of a survey given to numerous actors in the organizations' CRM [Customer Relationship Management] projects. They also found that most companies surveyed had risk and problem management programs that were mature rather than given lip service. The problem areas that stood out were communicating with senior management as well as with consultants and vendors. In conclusion, the researchers remark that information technology governance depends on senior management involvement and sound project management ability (Sharma, Stone, and Ekinci, 2009).

Given scope, risk, and project complexity, information technology governance offers a framework for unifying organizational objectives. Research completed in 2009 showed that information technology governance covers all the assets that may be involved in information technology, whether human, financial, physical, data, or intellectual property (Sharma, Stone, and Ekinci, 2009, p. 30). The same research showed that information technology governance requires top-down involvement, stating that successful implementation of information technology governance depends on senior management involvement, constancy, and positive project management abilities (Sharma, Stone, and Ekinci, 2009, p. 43). Senior management requires information to be shared, and a 2006 project journal publication supports this, remarking that continuous reporting builds the case for compelling board member involvement based on four factors: cost overrun history, material expenditures, [software] complexity, and any adverse effects on the company (Oliver and Walker, 2006, pp. 50-58).

The body of work, while much broader than sampled here, demonstrates support and strength in a number of areas of the problem statement. The literature selected ranges in date from 1997 to 2010, with the greater portion of the works being more recent, 2007 or thereafter. Some areas of work are dated or sparse, indicating a need for additional research, such as in the area of problem-solving abilities in vague or unclear circumstances. While much of the research spans several industries, drawn principally from industry and trade journals in information technology, general construction, and engineering, the project management principles and findings are transferable between project types. The body also included several academic studies and only two open-source articles. Most of the works were authoritative, under peer review. The dated works were cited more frequently than the more current works, as is to be expected.

The compelling thread in the body of work is that scope and risk concerns are influenced by project complexity, with cooperation, information sharing, conflict resolution, and competencies as significant factors in project success.

Discussion

Technology projects are challenged by a variety of factors that contribute to the performance of the project. The body of work indicates that risk and scope, complicated by project complexity, directly influence project success from the outset. Thus, early project planning is crucial to success. The body of work relating to the elemental aspects of competencies, information, cooperation, and conflict management offers historical support to risk and scope formulation. The one point that stands out is information sharing and flow at all levels. Additional research is necessary into the body of knowledge behind successful project managers and the relationship between their ability to reason through complex and obscure project problem sets and project-related competencies. Dated literature indicates a relationship between a positive locus of control and the willingness to engage abstract problems.

Commentary: I suggest that compartmentalizing a complex project into smaller projects should strengthen the locus of control and improve problem solving. In short, a smaller problem set is more easily grasped than an overwhelmingly large set of problems, thus reducing risk and strengthening scope definition. In breaking a complex project into smaller, achievable projects, the organization will gain greater control over the entire process and gain incremental successes toward the ultimate goal. Continuous improvement would characterize such an evolution. The master project manager must assess the order in which the smaller projects are completed. Some may be completed simultaneously while others may be completed sequentially.

A risk of scope creep may be introduced as an outcome of mitigating scope gap. To remain focused, all the projects must align with the organizational strategic objectives as they take strategy to task. New ideas need to be vetted in meaningful ways for the organization and aligned with the overall objectives in a comprehensive change management plan.


Communication is also essential in managing complex projects. The use of a wiki as a central point for foundational policies and information is often a best practice.

Large-scale, sudden disruptions of an organization are required under certain circumstances. However, in most circumstances complex projects need to be properly broken into smaller, manageable efforts that then become part of a continuous improvement effort within the organization.

References

(2004). Skills shortage behind project failures. Manager: British Journal of Administrative Management, (39), 7. Retrieved from Business Source Complete database.

(2008). AICPA's IT competency tool takes you down the path to success!. CPA Technology Advisor, 18(6), 60. Retrieved from Business Source Complete database.

Brooke, K., & Litwin, G. (1997). Mobilizing the partnering process. Journal of Management in Engineering, 13(4), 42. Retrieved from Business Source Complete database.

Chua, A. (2009). Exhuming IT projects from their graves: an analysis of eight failure cases and their risk factors. Journal of Computer Information Systems, 49(3), 31-39. Retrieved from Business Source Complete database.

Ferguson, E. (1999). A facet and factor analysis of typical intellectual engagement (tie): associations with locus of control and the five factor model of personality. Social Behavior & Personality: An International Journal, 27(6), 545. Retrieved from SocINDEX with Full Text database.

Bell, G.R. & Back, E.W. (2008). Critical Activities in the Front-End Planning Process. Journal of Management in Engineering, 24(2), 66-74. doi:10.1061/(ASCE)0742-597X(2008)24:2(66).

Iacovou, C., & Dexter, A. (2005). Surviving IT project cancellations. Communications of the ACM, 48(4), 83-86. Retrieved from Business Source Complete database.

Kesner, R. (2008). Business school undergraduate information management competencies: a study of employer expectations and associated curricular recommendations. Communications of AIS, 2008(23), 633-654. Retrieved from Business Source Complete database.

Kutsch, E., & Hall, M. (2009). The rational choice of not applying project risk management in information technology projects. Project Management Journal, 40(3), 72-81. doi:10.1002/pmj.20112.

Mitchell, V., & Nault, B. (2007). Cooperative planning, uncertainty, and managerial control in concurrent design. Management Science, 53(3), 375-389. Retrieved from Business Source Complete database.

Murray, J. (2001). Recognizing the responsibility of a failed information technology project as a shared failure. Information Systems Management, 18(2), 25. Retrieved from Business Source Complete database.

Natovich, J. (2003). Vendor related risks in it development: a chronology of an outsourced project failure. Technology Analysis & Strategic Management, 15(4), 409-419. Retrieved from Business Source Complete database.

Oliver, G., & Walker, R. (2006). Reporting on software development projects to senior managers and the board. Abacus, 42(1), 43-65. doi:10.1111/j.1467-6281.2006.00188.x.

Seyedhoseini, S., Noori, S., & Hatefi, M. (2009). An integrated methodology for assessment and selection of the project risk response actions. Risk Analysis: An International Journal, 29(5), 752-763.
doi:10.1111/j.1539-6924.2008.01187.x.

Sharma, D., Stone, M., & Ekinci, Y. (2009). IT governance and project management: A qualitative study. Journal of Database Marketing and Customer Strategy Management, 16(1), 29-50. doi:10.1057/dbm.2009.6.

Skilton, P., & Dooley, K. (2010). The effects of repeat collaboration on creative abrasion. Academy of Management Review, 35(1), 118-134. Retrieved from Business Source Complete database.

Sutcliffe, N., Chan, S., & Nakayama, M. (2005). A competency based MSIS curriculum. Journal of Information Systems Education, 16(3), 301-310. Retrieved from Business Source Complete database.

Vermeulen, F., & Barkema, H. (2002). Pace, rhythm, and scope: process dependence in building a profitable multinational corporation. Strategic Management Journal, 23(7), 637. doi:10.1002/smj.243.

Saturday, November 20, 2010

Innovation Shatters Paradigms

Several years ago, AT&T sought to leverage technology in its global networks, heralding the move as the most advanced network in the world. Its references to nodes, topologies, and AT&T's most technologically advanced network were reminiscent of traditional networking approaches, not advanced technology or methodologies. While clearly a marketing campaign, this is myopic, serving only established markets with known demand. AT&T's efforts are nothing more than a tactical grab for global market share stemming from a strategic plan to position for a perceived future marketplace. A true pioneer would define the marketplace instead of positioning itself as a jackal ready for prey. As a jackal, all you get is whatever happens to come along, and then a battle for morsels. A slight paradigm shift in thinking could propel information processing into realms far beyond any of the current thinking.

One paradigm that needs to be shattered is the idea of information sharing. Now, you may be thinking: what is this guy talking about? The X-Files, a popular television show, has a tagline, "The truth is out there," that applies well to this discussion. The current information sharing assumption is that someone out there is processing needed information; all we need to do is find that information. Unfortunately for both parties, neither knows of the other, let alone what information is needed and how best to exchange it once the need is discovered. We think we can just email it around as a CSV file or a word processing document, but these formats require a level of technical skill and time to digest the data. In general, computers, networks, and software are somehow the mystic medium that makes the connection to the other side. However, the connection is not so mystic. Instead, the connection should be managed right down to the desk or computational device!

Complex Adaptive Systems may offer a solution. Using the idea of the node, the complexity of network connections, and Just-In-Time (JIT) manufacturing concepts, information can be processed, advertised, and disseminated through dynamic networks globally. Nodes, instead of being peripherals, could become JIT U-shaped information processing cells where inputs and outputs are managed. These inputs and outputs are globally accessible using the existing telecommunication networks. The producer node advertises its products and services, then dynamically connects to an information consumer who has a need, as sketched below.
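As a toy sketch of this idea in Python: a producer node advertises an information product at a price, and a consumer node discovers it, buys it, refines it, and becomes a producer in turn. All names and prices are illustrative.

    class Network:
        """A registry where nodes advertise and purchase information products."""
        def __init__(self):
            self.advertisements = {}  # product name -> (price, producer)

        def advertise(self, product, price, producer):
            self.advertisements[product] = (price, producer)

        def purchase(self, product):
            price, producer = self.advertisements[product]
            return producer(product), price  # dynamic connection on demand

    net = Network()

    # A producer node advertises raw data.
    net.advertise("raw-sensor-feed", 1.00, lambda p: "raw readings")

    # A consumer node buys the raw data, refines it, and resells it.
    raw, cost = net.purchase("raw-sensor-feed")
    net.advertise("cleaned-feed", cost + 0.50, lambda p: raw.upper())

    refined, price = net.purchase("cleaned-feed")
    print(refined, price)  # -> RAW READINGS 1.5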

The beauty of such a system is the economy that emerges. Consumer nodes purchase raw data and process final products, becoming producer nodes whose offerings are advertised over the networks at a price. This kind of thinking could drive numerous new markets, including truly virtual companies. If a true virtual company emerges, the digital profit model may gain independence from the brick-and-mortar anchor to which it is tethered. There are so many possibilities, but industry needs independent creative thinkers who can shape the market rather than lie in wait in the weed patches of wrecked economies, hoping and waiting for opportunity.

References:

Slywotzky, A. (2003, September). The Art of Profitability. Grand Central Publishing. ISBN: 9780446692274

Englander, I. (2003). The Architecture of Computer Hardware and Systems Software: An Information Technology Approach (3rd ed.). New York: John Wiley & Sons Inc.

Monday, November 15, 2010

Spiritual Machines Create Challenges for Project Managers

What I am going to talk about originated as a discussion in my Master's in Information Technology program. It may seem far-fetched to many people, but it is an upcoming debate in the not-so-distant future. Holographic technologies have the potential to cause moral dilemmas for project managers who must implement these systems when they arrive. The early technology will be inanimate and mechanical in nature. As time passes, this technology will combine with neural nets and biological computing to create life-like machines that could potentially develop self-awareness. It is never too early to debate the questions and challenges these systems pose.

Holography was commercially exploited as early as the 1960s with the GAF viewfinder. As a young boy, I recall placing reels with images into a stereographic viewfinder, looking at the comic book world of Snoopy and other stories of dinosaurs. Later, I explored holography more deeply in technical books, learning how data is encoded in the interference patterns between reference and data beams. Science philosophy books explored the holographic universe and how the human eye-brain organ is a holographic system that interprets our world.

Scientists have struggled with the eye-brain to mind dilemma in humans. The brain is the mechanical operation, while the mind is spiritual in character. Holographic systems store information in terms of ghostly images, unlike conventional storage systems that store information in terms of attributes. According to Michael Talbot's book The Holographic Universe, holography's ethereal images reflect the way the human mind processes reality. The human brain can suffer trauma, losing large areas of tissue, yet somehow retains unfettered memories and even character. Likewise, a curious quality of holography is that all the information is stored ubiquitously throughout the storage medium, defeating divisibility short of catastrophic loss; any divisible piece contains the complete information set (Talbot, 1991). Thus, holography has the appearance of retaining the character or essence of the information stored despite failures and imperfections in the medium where the data is embodied.

Current robotic research is developing systems that mimic human sensory and motor capabilities. Software and processing hardware emulate human neural circuitry to produce human-like actions, including emotional responses, and to make human-like decisions. Both kinds of actions are mechanical in character, operating based on local action: for example, tracking and catching a baseball in flight, or performing a specific emotional response if the baseball hits the robot instead. The elements of surprise and creativity are more or less spiritual in character and have not yet been mastered by science, since they are not the local actions that science deals with. For example, reflecting on the flight of the baseball and describing it as screaming through the air is creative and not a local action. In fact, self-awareness may be a requirement for achieving surprise and creativity.

Holography creates theological concerns since its resilient retention of information is not mechanical. Instead, holographic data storage is based on waveforms, or electromagnetic energy patterns, also known as light waves, which are often equated with spirituality. There are theological implications: the Judeo-Christian Bible, for example, draws parallels between light, and the absence of light, and spiritual existence. In Genesis 1:4, "God saw that the light was good, and he separated the light from the darkness." Holographic ghostly images in storage and computational processing could depart silicon wafers and mechanical storage systems for the amino acids and proteins found in biological processing. Human tinkering could result in challenges by truly spiritual machines. If we are not careful, these biological machines could develop a consciousness and become annoyed with natural biological computers, also known as humans. In the end, mankind's technological conduct could potentially manufacture a nemesis. If for all the good in the world there is evil, then the human responsibility is to dispense the good and forsake the evil. Holographic storage is the beginning of a computational era that has the potential to elevate or degrade mankind.

"The development of every new technology raises questions that we, as individuals and as a society, need to address. Will this technology help me become the person that I want to be? Or that I should be? How will it affect the principle of treating everyone in our society fairly? Does the technology help our community advance our shared values?" (Shanks, 2005).

The possibility of computational systems based not on silicon but on amino acids and proteins, the building blocks of life, is clearly on the horizon and presents some puzzling questions. As these systems advance, project managers implementing them could be faced with significant ethical and moral decisions. Actions such as killing the 'power' on a living machine raise questions about life and the right to exist. Will man-made biological computers, perhaps through genetic engineering, develop self-awareness, spirituality, and a moral code of their own? How far will this go? What other moral and ethical issues could arise from the advent of this technology?

Please feel free to comment. I would enjoy hearing from you.

References:

Lewis, C. S. (2002, August). The Four Loves. Houghton Mifflin Harcourt. ISBN: 9780156329309

Englander, I. (2003). The Architecture of Computer Hardware and Systems Software: An Information Technology Approach (3rd ed.). New York: John Wiley & Sons Inc.

Kurzweil, R. (1999). The Age of Spiritual Machines: When Computers Exceed Human Intelligence. Penguin Books. ISBN: 9780140282023

Shanks, T. (2005). Machines and Man: Ethics and Robotics in the 21st Century. The Tech Museum of Innovation. Retrieved February 21, 2009, from http://www.thetech.org/exhibits/online/robotics/ethics/index.html

Talbot, M. (1991, January). The Holographic Universe. Harper Collins Publishers. ISBN: 9780060922580

Sunday, November 14, 2010

Social Media APIs: Architecting and Building Applications

This is a soft technical discussion of coding for social media. Some technical knowledge is helpful.

Social media has taken off and is here to stay. Preparing contextual material for the social media channels has been discussed in tremendous detail, with innumerable books highlighting this aspect. However, creating new capabilities and leveraging social media in newer ways often requires creative use of the technology, which almost always translates to creating a new application. Most companies react negatively to the thought of creating a new application, having had poor experiences in the past. With a solid plan, incremental or evolutionary development, and patience, the process can be much more palatable. It begins with an understanding of the technology.

Twitter, Google, LinkedIn, Facebook, and other social media instruments typically have a way to connect to their service. This connection is known as an Application Programming Interface (API). Through this interface, application-specific data is passed bidirectionally, in most cases through standard variables. Most instruments prefer that the application not embed into their site; they want the application to run on a separate server somewhere else on the planet. Site information is passed to the remote server via the API regardless of geographical location. What this means for a business seeking to leverage social media is that it can have complete control of its application and connect it to multiple social media instruments, as long as it meets the connection agreements.

Both the social media instrument and the remote application have APIs that must match up. With multiple instruments in use, the connection is not always clean. Therefore, the remote application should make use of a Gateway for each instrument connected. This Gateway maps the instrument-specific APIs to your application-specific APIs, as sketched below.
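Here is a minimal Python sketch of the Gateway idea: each instrument gets an adapter that translates its wire format into the application's own schema, so the application core never sees instrument-specific fields. The field names and payloads are hypothetical.

    class TwitterGateway:
        def to_internal(self, raw: dict) -> dict:
            # Map instrument-specific fields onto the application's schema.
            return {"author": raw["screen_name"], "body": raw["text"]}

    class FacebookGateway:
        def to_internal(self, raw: dict) -> dict:
            return {"author": raw["from"]["name"], "body": raw["message"]}

    def ingest(gateway, raw_payload):
        """The application core only ever sees the internal schema."""
        post = gateway.to_internal(raw_payload)
        print(f'{post["author"]}: {post["body"]}')

    ingest(TwitterGateway(), {"screen_name": "acme", "text": "hello"})
    ingest(FacebookGateway(), {"from": {"name": "Acme Co"}, "message": "hi"})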

Most APIs require authentication. Some social media instruments allow sending a simple username and password with each request, but more advanced methods are now in use by most of them. Twitter uses OAuth, which works like a valet key that limits access. Facebook uses a handshake approach with two elements to authentication: a connection and the state of being logged in. Facebook authentication is the most complex aspect of interfacing. The differences in authentication between social media instruments are another reason to use a Gateway specific to each instrument.
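As a hedged sketch of the OAuth "valet key" pattern, the following uses the requests_oauthlib Python library. The tokens and endpoint URL are placeholders; the instrument's current API documentation governs the real endpoints, versions, and scopes.

    from requests_oauthlib import OAuth1Session

    oauth = OAuth1Session(
        client_key="APP_KEY",                 # identifies your application
        client_secret="APP_SECRET",
        resource_owner_key="USER_TOKEN",      # the limited "valet key" the
        resource_owner_secret="USER_SECRET",  # user granted to your app
    )

    # Every request is signed; the user's password never touches your server.
    response = oauth.get("https://api.example.com/statuses/home_timeline.json")
    print(response.status_code)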

By properly architecting social media interfaces, companies can combine social media in unique ways to create niche markets, skirt around fierce competition, or otherwise reach their audience in meaningful ways. Project managers running these projects should look to combine Software Development Lifecycle (SDLC) practices with spiral and waterfall models to achieve incremental progress. Obviously, the project manager would want to prioritize connections and seek the greatest returns early.

Friday, November 12, 2010

You Gotta Talk! And Talk A Lot...

I want to make this posting to discuss an emergent situation I am observing. But first, your social media experience may follow the pattern of the Social Media Maturity Life Cycle Model; mine certainly did. When these instruments became known to me, I was still in the mindset of the good ole job search method of mailing resu-letters, as Jeffrey Fox teaches in his book How to Land a Dream Job (circa 2001), with the slight twist of emailing them. When I finally began to build my social network, I relied on it too much and became discouraged, but I kept building and networking. Slowly I became enlightened, mostly through some hard knocks, and currently my social media networks are becoming more active.

Let me digress a little to discuss some background. Over the past decades we have observed the advent of video games and other technology-based activities that essentially require solitary and/or faceless, nameless interactions. People could literally say anything and discuss anything behind an avatar and screen name. Social skills were tossed out the window as condemnations, insults, and putdowns dominated the majority of dialogues online. However, there is a resurgence of civility, especially where professionals are concerned.

A growing body of work illustrates a movement away from technology toward increased social interaction. The authors of the book Brains on Fire (2010) remark that technology is a trap. A crutch. They argue that it is a detriment. Other works stemming from Orville Pierson (2005), Dale Carnegie (1938), Stephen Covey (1989), and others stress that you have got to get out and talk to people. As the Internet grew in popularity, The Cluetrain Manifesto (1999) declared to the people of Earth that markets are conversations in which you have got to inspect the goods and ask questions. In other words, you have got to talk to people.

While LinkedIn, Facebook, and other social media instruments are useful in one's job search, nothing is a substitute for good ole getting out there, talking to people, and having people talk about you. Your social media networks are not truly useful as long as they remain in cyberspace. Plenty of movies, like Lawnmower Man and Tron, look to put one's essence into cyberspace, but you need to go the other direction and get out of cyberspace. Only one movie I can think of, Weird Science, actually has cyberspace entering real space, when two young boys feed data into their computer. They then hold a seance, and through a freak of nature a cyberspace being appears, stunning the boys. Making the leap from building a successful social media network in cyberspace to talking to people is a challenge for many. For some it can be a weird science. You've been putting connection data into the system; now is the time to bring it to life. How does one start the conversation, sustain the conversation, or even get difficult people involved?

This could be an expansive discussion delving deeply into one's psyche and requiring considerable study of charismatic methods and psychology. However, there is a simple approach. You do not need a seance and a weird science! You need the phone, your conversational skills, and some guts. CALL! If you are uneasy about calling, practice in front of a mirror, and smile. Script lines if it helps, but do not read them on the phone; people will know you are simply reciting lines. Nonetheless, CALL!

There are some simple rules for calling.
  1. Take notes.
  2. Call at a convenient time. When they answer, ask if it is a good time to call. If they are busy, schedule a new time to talk that is convenient for them.
  3. Disarm them. Tell them you are not calling for a job. Say that you are just touching base.
  4. Make small talk. Discuss common points relating to your relationship with them. If you lack knowledge on something they are passionate about, read an article about it before calling.
  5. Keep it brief.
  6. Put em on a tickler to call every three to four weeks.
  7. In three to four weeks, CALL them again. 
Of course, you will need to assess each connection you contact and treat them based on the strength of your relationship with them. If you've had an interview, do the interview follow-up process I discussed in my earlier blog postings.
The bottom line is you got to get off your duff and get out there. You need to meet and talk to people. The more you do this, the better you get at it, and the greater your opportunities!

Commentary: I do not want to confuse people with my "Become a Good Conversationalist" post. In that post, you got to shut up and practice listening, especially during an interview. In this post, you need to get your message out and talk to many people.

Tuesday, November 9, 2010

Operational Risk Management Military Style versus Industry Approach

Operational Risk Management

Military risk decision making is often time-critical and made by immediate leadership as risk conditions emerge. The level of risk planning in the military tends to be rapid and subjective. Operational Risk Management (ORM) attempts to organize a more deliberate process. ORM is intended to be a decision-making tool for leadership at all levels, used to proportion risk against mission objectives. The goal is to minimize risk and maximize objectives. The ORM assessment model cycles through a five-step process that identifies, assesses, decides, implements, and then monitors risk hazards. During the assess step, qualitative values are assigned to risk hazards using the appropriate supporting risk matrix of acceptable risks. The risk assessment codes assigned may range from critical to negligible or may reflect likelihood of occurrence. Once the level of risk is assigned, the decision maker seeks means to reduce the most probable high risks as well as increase the benefits or outcomes. This is more or less a subjective gut feel. Controls are implemented and then monitored to ensure that the risk does not begin to increase. If it does, the decision maker may recycle the process to minimize the emergent risk (Naval Safety Center, 2009).
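As a rough illustration of the assess step, here is a small Python sketch of a risk matrix mapping severity and probability onto a risk assessment code (RAC). The exact scales and codes vary by service and instruction, so the values below are illustrative only.

    SEVERITY = ["negligible", "marginal", "critical", "catastrophic"]
    PROBABILITY = ["unlikely", "seldom", "likely", "probable"]

    # RAC 1 = highest risk ... RAC 5 = lowest (rows: severity, cols: probability).
    RAC_MATRIX = [
        [5, 5, 4, 4],  # negligible
        [4, 4, 3, 2],  # marginal
        [3, 2, 2, 1],  # critical
        [2, 2, 1, 1],  # catastrophic
    ]

    def assess(severity: str, probability: str) -> int:
        """Return the risk assessment code for a single hazard."""
        return RAC_MATRIX[SEVERITY.index(severity)][PROBABILITY.index(probability)]

    # Identify -> assess -> decide: treat RAC 1-2 hazards before accepting risk.
    for hazard, sev, prob in [("fuel spill", "critical", "seldom"),
                              ("minor abrasion", "negligible", "probable")]:
        print(hazard, "-> RAC", assess(sev, prob))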

In contrast, risk planning in project management utilizes a breadth of techniques that avoid, transfer, mitigate, and/or accept negative risk, or exploit, share, and enhance positive risk factors (Heldman, 2009, pp. 264-267). Project risk management utilizes risk matrices to assess the level of risk (Heldman, 2009, pp. 250-254); the risk matrices are a common point between the two approaches. In practice, ORM is adjusted to personal style and is most often executed as a talk-through process in a huddle or meeting, whereas project risk management is more formal and utilizes argumentation structures to support risk decisions. Managers not trained in argumentation may use it unknowingly and wing it to some extent; understanding this process, with supporting reasons, will improve project decision making.

The critical factors of success under ORM are to increase awareness of the level of risk involved in order to lower injuries, deaths, mishaps, and property damage, the negative effects of risk. A positive outcome of the risk management undertaken is that unnecessary risk is avoided and decisions are made at the appropriate level. The greatest critical factor of success is conserving assets so they can be applied at the decisive time and place (Naval Safety Center, 2009). In industry, conserving assets is often thought of as timing expenditures or allocating resources optimally. This is one point of having a Project Management Office (PMO), but it can be managed within the operations and project realms as well.

References:

Heldman, K. (2009). PMP: Project management professional exam study guide (5th ed.). Indianapolis, IN: Wiley Publishing Company.

Naval Safety Center. (2009). Operational Risk Management. Retrieved from
http://safetycenter.navy.mil/

Friday, November 5, 2010

Caterpillar Leverages Information Technologies for Sustainable Growth

Business is warfare based principally on the sage utilization of information, a key factor determining success. Caterpillar has long recognized that access to accurate information, in order to build actionable knowledge, is critical to business success. Caterpillar is a complex global enterprise based out of Peoria, Illinois, that through well-tuned information management is achieving incredible success. Sales revenues during 2007 exceeded forty-four billion dollars (Caterpillar, 2007, Annual Report, p. 33). Enterprise growth goals by 2010 are projected to exceed fifty billion dollars (Caterpillar, 2007, Annual Report, p. 27). This expansion of revenues is coming with solid vision and sage business design. Caterpillar's vision centers on sustainable development, utilizing a strategy of innovation and technologies in support of the company's objectives (Caterpillar, 2007, Shape Report, p. 36). This means information and the requisite systems are principal to analysis, rapidity of decision making, and identification of actionable business opportunities.

Intellectual Capital Drives Innovation

Many professionals in business incorrectly believe intellectual capital (IC) is simply good ideas that become proprietary because of the station at which the idea was imagined. As an outcome, these professionals believe a company has a legal claim to a good idea. The reality is that good ideas are abundant; nearly everyone has one, but most lack the means to put a good idea into effect.

Intellectual capital is better thought of as knowledge that can be converted into commercial value and a competitive advantage, resulting in intellectual assets of the company. The conversion of knowledge into commercial value requires a means to codify the knowledge into an intellectual asset. To achieve this, companies provide structural capital in support of the human capital to gain access to intellectual assets. Thus, IC results from human and structural capital operating in a unique relationship, forming intellectual assets. Companies distinguish their operations from the competition by combining knowledge and infrastructure in differing ways. The process of converting knowledge into intellectual assets results in the innovation that companies seek to commercialize (Sullivan, 1998, p. 23).

According to the book The Innovator's Solution by Clayton Christensen, innovation in business means growth resulting from the introduction of something new that can be exploited for commercial value. Christensen further explains that sustaining growth focuses on delivering new and improved benefits to high-end customers. He then comments that companies are more interested in disruptive growth, which results in reduced cost, simplicity, and new directions. Introducing something new is often thought of as unpredictable, which is not desirable to most companies. Christensen believes the key to innovative success is not predicting human conduct, as innovation rarely comes fully developed from a single human. Instead, he comments that companies must understand the forces that act on humans. What this means is that when innovation is managed through design, there is predictability, and companies are more readily apt to embrace the change.

In the classic understanding of design, there are three characteristic aspects: the visceral, or how the design looks; the behavioral, relating to the design's functionality; and the reflective, the qualities that provoke thought. In classic design, beauty is also found. Good designs demonstrate beauty through harmony and fluid execution. As companies increase in size and complexity, the problem of accessing knowledge becomes exponentially more difficult. Messages between top-level intent and bottom-level action can become confused and misdirected if not properly managed. Thus, a reliance on finely tuned information technologies becomes an imperative.

Caterpillar has made deliberate efforts to employ information technologies that demonstrate good design. For example, a visual imaging company, Real D-3D, posted an article on its company website regarding Caterpillar's need to speed engineering projects to market by employing visualization technology in a project called “CrystalEyes”. According to this article, a key feature of the CrystalEyes project was to make the information tool simple to use for engineers and clients alike, eliminating prototyping iterations; the tool also had to be cost effective, cross-platform, and easily integrated with existing systems. These requirements demonstrated the behavioral qualities of a good design. Real D-3D described “CrystalEyes” as a stereographic imaging tool, an improvement beyond ghostly holographic effects, that met all the design criteria. They described, for example, designs that can simulate in 3-D the full effect of parallax and other phenomena related to stereoscopic imaging. Thus, “CrystalEyes” illustrated the visceral elements of a good design. The benefit CrystalEyes delivered was a high-performance design visualization tool that eliminated physical builds until the very end (Copy Editors, Real D-3D). Using the CrystalEyes tool afforded clients and engineers alike the ability to fully understand a design in work, provoking thought, the reflective quality of good system design, throughout the engineering iterations.

Management Information Systems Build Decision Support SubSystems

Management information systems (MIS) are complex. These systems come in a variety of technologies and capabilities, and one size does not fit all operations. In general, MIS involves at least three elements: a network or hardware laydown; supported management concepts; and integrated decision analysis and reporting. Through combinations of these elements, companies are able to leverage themselves in competitive ways and provide the infrastructure for innovation.

Caterpillar leads the industry with decision support subsystems. Data collected from significant customer segments and from Caterpillar’s geographically dispersed operations is infused into the creation of products and services in support of growth. The systems span over two hundred independent dealers globally and their proprietary networks. Caterpillar’s efforts include numerous projects and software tools that fuse these systems together, including but not limited to:
  • VIMS: Vital Information Management System is a vehicle-borne alert system that assesses the equipment’s safe and optimal operating condition. When a problem begins to emerge or is discovered, the system alerts the operators and owners, then provides safe shut down procedures if necessary. This enhances the service life of the equipment and is a decision support subsystem.
  • Product Link: A wireless system that simplifies the work of tracking the fleet by providing asset management information. Product Link couples with VIMS.
  • Paperless Reporting: A wireless project that integrates the Dealer Business System and Service Technician’s Workbench with field service technicians, reducing errors and streamlining data entry requirements.
  • EquipmentManager: Software designed to report fleet performance and manage assets. This application is the owner’s front end that presents the VIMS and Product Link performance information on demand in meaningful ways.
  • VIMS Supervisor: Vital Information Management System Supervisor Software provides custom fleet production and maintenance reports by extracting data from the VIMS database.
  • Caterpillar’s authoring system: A system that is both an information consumer and producer, organized to streamline global technical publication operations.
The VIMS, Product Link, Paperless Reporting, and authoring projects are of particular interest as they are subsystems that impact a sequence of other systems, ultimately feeding up to top level decision support systems.

Product Link Pools Global Equipment Performance Information

Caterpillar introduced a subsystem called “Product Link” that leverages equipment performance information collected by VIMS for decision support. “Product Link” is a management tool that tracks and gathers information about Caterpillar’s earthmoving equipment. An online HUB Magazine article written by Caterpillar’s Information Centre described the subsystem as composed of two antennas, a data module, and interconnecting wiring. One antenna collects GPS data while the other provides bidirectional communication with the network operations center. The data module manages the collection of performance and GPS data as well as instructions from the network operations center. Information collected is transmitted to a Caterpillar network operations center wirelessly through low Earth orbit, LEO, satellites. At the network operations center the information is further evaluated, then reports are prepared and sent to the equipment owner. Equipment owners are able to access the information over the Internet using the “EquipmentManager” software.
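The article does not publish Product Link’s wire format, but the data module’s job, packaging GPS fixes and VIMS performance readings for uplink, can be sketched in broad strokes. All field names below are hypothetical stand-ins, not Caterpillar’s proprietary format.

```python
import json
from dataclasses import dataclass, asdict
from datetime import datetime, timezone

@dataclass
class TelemetryRecord:
    # Hypothetical fields; the actual Product Link payload is proprietary.
    machine_id: str
    timestamp: str
    latitude: float          # from the GPS antenna
    longitude: float
    engine_hours: float      # utilization data gathered by VIMS
    fault_codes: list        # active alerts, if any

def build_record(machine_id, lat, lon, hours, faults):
    """Package one reading for uplink to the network operations center."""
    rec = TelemetryRecord(
        machine_id=machine_id,
        timestamp=datetime.now(timezone.utc).isoformat(),
        latitude=lat, longitude=lon,
        engine_hours=hours, fault_codes=faults,
    )
    return json.dumps(asdict(rec))

# The data module would queue payloads like this for the LEO satellite uplink.
payload = build_record("D11-00421", 40.69, -89.59, 1234.5, ["E361-2"])
```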

According to Caterpillar, the benefit to both parties is improved asset management: longer equipment service life, reduced downtime, and a strengthened return on investment. These have been principal reasons customers purchase Caterpillar equipment. Therefore, understanding equipment utilization, location, and performance data helps Caterpillar design more robust equipment that meets equipment owners’ expectations.

This subsystem operates seamlessly, with the equipment reporting to the network operations center where the data is collated and eventually rolled up into top level decision support systems, demonstrating beauty in the design’s fluidity. The information provided to the owner through “EquipmentManager” answers concerns about utilization, security, and uptime, according to Caterpillar, further illustrating the functionality and reflective utilization of the design.

Paperless Reporting Links Field Service Technicians Into Global Systems

A case study regarding Michigan Caterpillar’s paperless project initiative was researched and published in Directions Magazine by Mike DeMuro, Product Support Manager for Michigan Caterpillar. According to the article, Michigan Caterpillar field service technicians were experiencing a time-consuming and error-prone process in their dispatch system reporting. Technicians were using an antiquated process of paper forms that were transcribed into the system in the classic data entry manner. In some cases, information was passed verbally and transcribed days later. Often the information was incomplete or erroneous. Caterpillar sought to streamline the process; a statewide centralized dispatch system forming a mobile office was in order, assesses DeMuro.

DeMuro explains that the design of the system utilized an enterprise data integration service that offered both cellular and satellite coverage. Caterpillar’s Dealer Business System and Service Technician’s Workbench were integrated with the enterprise data integration service and Microsoft Outlook. After data was entered once into the system, technicians could drag and drop data into Outlook templates and distribute it without error-prone re-typing. The emails were received by servers, and scripts parsed the data into the other systems, further reducing errors and increasing productivity. This created a paperless culture of online forms that transmitted data wirelessly between service vehicles equipped with the system and staff functions. DeMuro further claims this innovative approach radically improved billing cycles, accuracy, and timeliness of data reporting. Other first order benefits included reduced overhead for data re-entry, increased productivity and revenue generating hours, timely parts delivery, and seamless integration of systems. This resulted in the secondary effects of improved cash flows and accounting for receivables, explains DeMuro.
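DeMuro does not detail the server-side scripts, but the pattern, parsing labeled fields out of a templated email body and handing the record to downstream systems, is straightforward. A minimal sketch, with entirely hypothetical field labels:

```python
import re

# Hypothetical field labels; the case study does not describe the
# Outlook templates field by field.
FIELD_PATTERN = re.compile(r"^(WorkOrder|Machine|Hours|Parts|Notes):\s*(.+)$",
                           re.MULTILINE)

def parse_service_email(body: str) -> dict:
    """Extract labeled fields from a technician's templated email body."""
    return {label: value.strip() for label, value in FIELD_PATTERN.findall(body)}

def route(record: dict) -> None:
    """Stand-in for the handoff to the Dealer Business System, billing, etc."""
    print("posting to downstream systems:", record)

sample = """WorkOrder: 58213
Machine: 320D-7731
Hours: 3.5
Parts: 1R-0750 fuel filter
Notes: replaced filter, cleared fault code"""
route(parse_service_email(sample))
```

Because the template enforces the structure at entry time, the parse step is mechanical, which is precisely why re-keying errors disappear.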

Again Caterpillar achieved beauty in its seamless design for field service technician reporting. Error rates were confined to the initial data entry, since all re-entry was eliminated, yielding highly productive functionality of design. The data gathered cascades through to higher level systems for further evaluation.

Technical Authoring System Forms Intellectual Assets

Caterpillar was experiencing problems with the accuracy, timeliness, and availability of its technical publications. There were over 300 products, some with lifecycles as long as 50 years. Compounding this immense data requirement were operations in 35 languages. Therefore, in the late 1990s Caterpillar envisioned a need for a better method of managing the labor intensive effort of technical documentation. The company pursued innovation by taking advantage of the emerging Standard Generalized Markup Language, SGML, standards that overcame the limitations of the existing methods. The new approach delivered levels of efficiency, based on reuse and automation, that had never been observed.

Caterpillar began by creating a Technical Information Division, TID, with global responsibility for producing the documentation necessary to support operations. It expanded the technical documentation staffing by 200%, then organized the automated publishing system, the structural capital that enabled the staff to deliver the technical documentation, or intellectual assets. These assets included maintenance manuals, operations and troubleshooting guides, assembly and disassembly manuals, specification manuals, testing and special instructions, adjustment guides, and system operation bulletins.

In the design of the authoring system, Caterpillar took a modular approach to information creation and automated where possible. The system designers built on top of industry standards, even utilizing MIL-PRF-28001 for page composition. They created reusable ‘information elements’ capable of being used in multiple formats and forms. This approach drastically reduced the costs associated with creating, reviewing, revising, and translating information. Through automation of document composition from information elements, Caterpillar achieved collaborative authoring that trimmed time-to-market and permitted increased focus by subject matter experts, which strengthened the quality of the product. The efficiencies yielded staggering improvements in work flow and analysis, document development, style sheet design, and legacy conversion. In the end, Caterpillar achieved an accuracy, timeliness, and availability of technical information that became of immense commercial value and competitive advantage.
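Bartlett’s paper describes the reuse principle rather than the implementation, so the following is only a schematic illustration of single-source ‘information elements’: each publication is an ordered list of references to shared elements, and an element edited once is corrected in every publication on the next build. The element IDs and text are invented.

```python
# Invented element IDs and content, illustrating single-source reuse.
ELEMENTS = {
    "safety-lockout": "Lock out the hydraulic system before any service...",
    "filter-replace": "Remove the housing and replace the filter element...",
    "torque-spec":    "Torque the housing bolts to the specified value...",
}

# Each publication is just an ordered list of element references.
PUBLICATIONS = {
    "maintenance-manual":    ["safety-lockout", "filter-replace", "torque-spec"],
    "troubleshooting-guide": ["safety-lockout", "filter-replace"],
}

def compose(pub_id: str) -> str:
    """Assemble a document from shared elements. Translating or revising an
    element once propagates to every publication that references it."""
    return "\n\n".join(ELEMENTS[eid] for eid in PUBLICATIONS[pub_id])

print(compose("troubleshooting-guide"))
```

The economics follow directly: with 300 products and 35 languages, every element reused is a translation and review cycle that never has to be repeated.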

Caterpillar’s copyrighted technical documentation is of such value that criminal elements have attempted to exploit it. In May 2002, Caterpillar’s digital library of parts and product catalogues, service manuals, schematics, tooling data, and product bulletins was compromised. U.S. Customs reported seizing a half million dollars in counterfeit Caterpillar technical documents. This criminal activity demonstrates that well designed intellectual assets can be both highly valuable and vulnerable.

Data Warehousing Efforts Consolidate Enterprise Data

Designing solid data management methods is critical to business success. MIS generally approaches decision making from the process side, such as a purchase order process, whereas decision support systems tend to focus on conduct and behavioral characteristics, such as fuel consumption trends. This requires the gathered data to be stored, parsed, and analyzed in ways that support strategic decision making rather than operational management. The outcome of a well designed data warehousing system is that equipment managers shift their focus from operational level decision making to corporate level strategic decision making regarding asset management.

Data marts are working subsets of larger primary database systems used to present unique views on subject matter topics. These data marts are then organized in a way that permits multi-dimensional modeling of the operations. This multi-dimensional model is called the data cube. Online Transaction Processing, OLTP, and Online Analytic Processing, OLAP, usher data routinely into the data cubes and conduct ongoing analytic evaluation of the data in support of on-demand or real-time review. These tools have also been advanced over the Internet, permitting authorized decision support system users to conduct the analysis they are seeking.
The benefits of data warehousing include better end user control of data analysis, improved tooling for identifying and investigating problems, strengthened strategic decision making, and improved knowledge discovery. Data warehousing is the foundation of computer aided construction equipment and asset management.
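As a rough sketch of the data cube idea described above, the fragment below builds a tiny fact table and pivots it along two dimensions, with subtotals standing in for an OLAP roll-up. The fleet figures and dimension names are invented.

```python
import pandas as pd

# Invented fleet telemetry rolled into a small fact table.
facts = pd.DataFrame({
    "region":  ["NA", "NA", "EU", "EU"],
    "model":   ["D11", "320D", "D11", "320D"],
    "quarter": ["Q1", "Q1", "Q1", "Q1"],
    "fuel_l":  [9200, 4100, 8800, 3900],
})

# A two-dimensional slice of the cube: fuel consumption by region x model.
# The "All" margins play the role of roll-ups along each dimension.
cube = pd.pivot_table(facts, values="fuel_l",
                      index="region", columns="model",
                      aggfunc="sum", margins=True, margins_name="All")
print(cube)
```

A full OLAP engine precomputes and indexes many such slices so they answer on demand, which is the whole point of continuously ushering data into the cube.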

Caterpillar sought a global data solution and chose TeraData Inc as its business partner in March of 2008. TeraData’s business decision support solution comprises component products built on top of the “Active Data Warehouse” product. The component products provide intelligence, analytics, and other services in support of decision making.

The Active Data Warehouse product is the underpinning of these services and refers to the technical aspects required to achieve the desired objectives of the data warehouse. This database is designed to receive feeds from mature MIS subsystems such as Caterpillar’s VIMS, paperless reporting, and authoring subsystems, resulting in a repository of data with high confidence in its accuracy. The database can be used in ordinary MIS support of ecommerce, portals, and other web applications, but it has greater impact when coupled with decision support applications. With confidence in the data’s accuracy, complex reporting and data mining can be generated from tactical or short-notice queries in near real time, making this solution a powerful tool. This capability originates from TeraData’s strategy, built on the findings of a 2001 Gartner report that data marts cost 70% more per subject area than a comparable traditional data warehouse (Sweeney, 2007). TeraData seeks to consolidate data marts, reduce redundancy, and streamline the data loading process into a centralized analytic source, in effect creating a massive sole source data mart equivalent to the enterprise wide data set. This streamlining is consistent with Caterpillar’s desire to innovate through technology, resulting in the 2008 agreement to improve Caterpillar’s decision support.
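The consolidation strategy can be illustrated schematically: two hypothetical subject-area marts are merged into one conformed table keyed on the machine, so a tactical query hits a single source instead of stitching marts together itself. None of this reflects TeraData’s actual implementation.

```python
import pandas as pd

# Invented per-subject data marts with overlapping machine records.
service_mart = pd.DataFrame(
    {"machine_id": ["A1", "B2"], "last_service_h": [1200, 880]})
utilization_mart = pd.DataFrame(
    {"machine_id": ["A1", "B2"], "engine_hours": [1350, 910]})

# Consolidation: one conformed table keyed on machine_id, loaded once,
# queried by every analytic tool.
warehouse = service_mart.merge(utilization_mart, on="machine_id", how="outer")

# A short-notice tactical query the consolidated source answers directly:
# which machines are more than 100 hours past their last service?
due = warehouse[warehouse.engine_hours - warehouse.last_service_h > 100]
print(due)
```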

Business Intelligence Products Strengthen Decision Support

TeraData’s component products include a suite of applications that utilize Caterpillar’s enterprise wide data warehouse for analytic and intelligence reporting. Tools in this suite include strategic and operational intelligence applications, data mining tools, modeling software, and analytical tool sets that handle extremely large datasets, looking for criminal conduct as well as emerging trends. Also included in the suite are maintenance and management tools.

Bringing Information Technology Projects in Focus

Caterpillar brings disparate systems together into a symbiotic global information presence through network operations centers, communication networks, and data processing methods and systems. The elements of good design are observed throughout the systems at Caterpillar and create a culture that promotes innovation, whether in technical publication, engineering, or field management of the equipment. With this foundation in place, Caterpillar began a process of increasing vertical accuracy across its systems into decision support systems. The disparate enterprise data is rolled up into the decision support data warehouse and its requisite set of tooling, establishing a formidable competitive instrument. Agreements with TeraData in early 2008 led to solutions implementing near real time reporting with increased accuracy. As an outcome, Caterpillar has been propelled to the forefront of heavy equipment manufacturers, becoming the industry leader with growth projections that eclipse its competitors. Nonetheless, Caterpillar is restless; becoming number one in the industry is simply not enough for this giant.

The Future is Bright

Caterpillar’s position as the industry leader is not the end state for this company. One concept of business holds that no company makes a profit over the long term; the purpose of any business is to be a vehicle that provides income and dignity to human life. In executing this concept, principles and moral responsibilities are assigned to companies and governed through cooperation between government and industry. Caterpillar has taken on the next evolution of large corporations: corporate governance. It defines its vision in a sustainability report called “Shape”. The term shape is a key notion, inclusive of the forces that forge innovation in the shaping of knowledge into business plans. Caterpillar has identified the pillars of its “Shape” initiative as:
  • Energy and Climate: Caterpillar realizes the importance of energy security and the impact energy consumption by the equipment has on the ecology.
  • Growth and Trade: Expanding economies and international business are important to sustainable operations.
  • People and Planet: Caterpillar equipment builds economies and lifts people out of poverty.
  • Further and Faster: Shape takes form over time, then accelerates as the vision organizes. Caterpillar must be willing to drive the vision beyond that which is currently known in order to embrace the future of sustainability.
Using Caterpillar’s systems and technologies, the company is actively organizing a plan to reach for the moral high ground and is embracing corporate governance. Caterpillar’s equipment is known to move mountains. In time, as corporate governance takes shape, Caterpillar will emerge as a social force that levels societal inequities while elevating human dignity around the globe. Humans will have jobs with disposable incomes, improved roads, hospitals, and strengthened economies built by Caterpillar’s equipment and backed by Caterpillar’s social conscience.

References:
  1. Bartlett PG, 1997, “Caterpillar Inc's New Authoring System”, SGML Conference Barcelona 1997, Retrieved October 15, 2008, http://www.infoloom.com/gcaconfs/WEB/barcelona97/bartlet8.HTM#
  2. Caterpillar Public Affairs Office, 2007, “2007 Caterpillar Annual Report”, Retrieved October 10, 2008, http://www.cat.com
  3. Caterpillar Public Affairs Office, 2007, “Shape: Sustainability Report”, Retrieved October 10, 2008, http://www.cat.com
  4. Caterpillar Public Affairs Office, 2008, “Caterpillar Logistic Services Inc Web Site”, Retrieved October 12, 2008, http://logistics.cat.com
  5. Christensen, Clayton M, (2003), “The Innovator’s Solution”, (1st ed), Boston Massachusetts, HBS Press
  6. Copy Editor, ”Caterpillar moves Mountains in Speeding Time-To-Market using CrystalEyes and Stereo3D Visualizations”, Real D-3D, http://reald-corporate.com/news_caterpillar.asp
  7. Copy Editor, July 2007, “New-generation Product Link system from Caterpillar improves asset utilization and reduces operating costs”, HUB, Retrieved October 18, 2008, http://www.hub-4.com/news/633/newgeneration-product-link-system-from-caterpillar-improves-asset-utilization-and-reduces-operating-costs
  8. DeMuro, Mike, April 2005, “Michigan CAT Case Study”, Directions Media, Retrieved October 17, 2008, http://www.directionsmag.com/article.php?article_id=823&trv=1
  9. Eckerson, Wayne W., (2007), “Best Practices in Operational BI: Converging Analytical and Operational Processes”, TDWI Best Practices Report
  10. Hongqin Fan, 2006, “Data Warehousing for the Construction Industry”, NRC
  11. Schwartz, Evan I., (2004), “Juice: Creative Fuel that drives World Class Inventors”, (1st ed), Boston Massachusetts, HBS Press
  12. Sullivan, Patrick H., (1998), “Profiting from Intellectual Capital: Extracting value from Innovation”, (1st ed), New York, John Wiley & Sons, Inc.
  13. Sweeney, Robert J., (2007), “Case Study: TeraData Data Mart Consolidation ROI”, TeraData Corp.

Thursday, November 4, 2010

Global Outsourcing Opens Short-Term Windows of Opportunity

Global outsourcing will continue to expand and shift to more attractive markets based on lower labor costs and the emergence of disruptive technologies that may make global outsourcing more enticing. Under these circumstances, companies should anticipate brief windows of opportunity before the technology and labor markets shift, making the original outsourcing decisions obsolete. Companies should design for long term strategic advantage that relies less on emergent technologies and low labor costs.

The article Deriving Mutual Benefits from Offshore Outsourcing discussed the 24 hour knowledge factory, which leverages the temporal and geographic dispersion of labor to the benefit of businesses. The motivation for such a paradigm will emerge from a shift away from short term economic drivers and toward the knowledge factory in the long term. The researchers studied numerous situations of varying degree between the art and science of knowledge production. They concluded that the 24 hour knowledge factory concept could produce new products and services with a swifter time-to-market and strategic and economic advantages, and that it will be used by an increasing number of industries (Gupta, 2009). The knowledge factory is an interesting concept as long as the knowledge is actionable, that is, it can be exploited for business advantage through reduced costs or increased revenue. Other business advantages may include strategies of reduced organizational latency, economies of scale, or increased market share.
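The mechanics of a 24 hour handoff can be made concrete with a small time zone calculation. The sketch below, with invented sites and a nominal 9-to-5 local working day, expresses each site’s shift in UTC so the gaps and overlaps that define handoff windows become visible.

```python
from datetime import datetime, timedelta
from zoneinfo import ZoneInfo

# Invented sites and an assumed 9:00-17:00 local working day.
SITES = {"Boston": "America/New_York",
         "Pune": "Asia/Kolkata",
         "Krakow": "Europe/Warsaw"}

def local_shift_utc(tz: str, day: datetime):
    """Return a site's 9-to-5 shift for `day`, expressed in UTC."""
    start = datetime(day.year, day.month, day.day, 9, tzinfo=ZoneInfo(tz))
    end = start + timedelta(hours=8)
    return start.astimezone(ZoneInfo("UTC")), end.astimezone(ZoneInfo("UTC"))

day = datetime(2010, 11, 4)
for name, tz in SITES.items():
    s, e = local_shift_utc(tz, day)
    print(f"{name}: {s:%H:%M}-{e:%H:%M} UTC")
# Gaps or overlaps between consecutive shifts mark the handoff windows a
# 24 hour knowledge factory has to manage explicitly.
```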

In the article Innovating or Doing as Told? Status Differences and Overlapping Boundaries in Offshore Collaboration, the researchers sought to understand the relationship of cultural, organizational, and functional boundaries to distributed teams. The study explored a variety of collaborative conditions and circumstances, including the influential factors. The researchers also studied decision making processes, concluding that many factors affected the outcomes, and they offered four generalized findings for selecting a geographic location. They concluded that effective collaboration relies on managing good people effectively; the key challenge is to develop IT personnel to be effective in boundary spanning roles and to support them with proper authority and resources (Levina and Vaast, 2008). Exploiting technology is a business strategy that is often short lived, yielding only short term gains before technological obsolescence sets in. The stronger approach is to understand large scale inter-boundary processes and seek to optimize them.

The article Innovative Organization: Creating Value Through Outsourcing offered framework for outsourcing decisions. The author recognized that there are hidden costs to outsourcing that impact the bottom line. He explores outsourcing problems before presenting the innovative outsourcing framework that involves reasons, timing, contract, accountability, lifecycle, and selection as factors in decision success (Tadelis, 2007).  Companies should develop a framework that is inclusive of all the factors affecting outsourcing is essential to sustainability.

The three articles reviewed present a clear picture that globalization and outsourcing have gained critical mass and will continue to grow in the coming years. They present challenges, solutions, and opportunities in outsourcing. One common thread was the complexity involved in multi-national outsourcing. This complexity arises out of numerous boundary factors, including cultural, legal, temporal, and structural boundaries, to name a few. In terms of project management, the body of work indicates that the complexity of outsourcing requires change management, specifically in terms of leadership, authority, information technology staff skilled in the challenges of outsourcing environments, and the proper resources to cope with those challenges. As an organization implements outsourcing, change management must transition the organization from an internalized, organic operation to a dispersed, decentralized operation spanning many boundaries.

The article Deriving Mutual Benefits from Offshore Outsourcing presented an interesting notion of knowledge factories that create new opportunities for human employment and strategic advantage for companies. This may be a short lived benefit due to the exponential swiftness with which technology is advancing. Computational technologies tend to be based on time-to-profit models in which earnings must be made in the short period following introduction, or else the technology becomes obsolete very swiftly. Disruptive technologies emerge frequently, and they could potentially nullify multi-national knowledge factories before they become entrenched, forcing workers to retrain in other fields. In conclusion, outsourcing will continue, but not without a host of challenges. The sage approach is to weigh all the factors carefully, then prioritize potential offshore locations based on strategic objectives and less on cost or technological advantage. Companies should anticipate a brief window of opportunity before technology and the markets shift, making the original outsourcing decisions obsolete.

Project Management Commentary: Project managers juggling projects across international boundaries, for projects that operate 24 hours a day, cannot rely on hands-on, tactile management methods. Instead, these project managers may discover they have to structure a framework that pushes decision making down to lower levels and embraces more frequent quality management checks across decentralized, geographically dispersed, autonomous teams. Performance and quality incentives may need to be coupled to pay, increasing earnings, and to promotions, offering eligibility or improved consideration. These programs may need to take local labor regulations into consideration.

References:

Gupta, A. (2009). Deriving mutual benefits from offshore outsourcing. Communications of the ACM, 52(6), 122-126. Retrieved from Business Source Complete database.

Levina, N., & Vaast, E. (2008). Innovating or doing as told? Status differences and overlapping boundaries in offshore collaboration. MIS Quarterly, 32(2), 307-332. Retrieved from Business Source Complete database.

Marchewka, J.T. (2009). Information technology project management: Providing measurable organizational value (3rd ed.). United States: John Wiley and Sons, Inc.

Tadelis, S. (2007). The innovative organization: Creating value through outsourcing. California Management Review, 50(1), 261-277. Retrieved from Business Source Complete database.