It is a sunny, windy morning. Seven hundred participants from all over the world have come to the technology park in Sanada to attend the Technology for Humanity and Global Security Summit 2020. The President of Sanada has inaugurated the summit. It is an open forum with ten interactive brainstorming sessions; the participants are raising a set of debatable and intelligent questions on poverty, sustainable development goals, global security policy, business model innovation, economic growth and entrepreneurship. Dr. S. Chakraborty is presenting a basic overview of deep analytics. He outlines the concept and mechanism of deep analytics for evaluating technology management in terms of seven ‘S’ elements (scope, system, structure, security, strategy, staff-resources and skill-style-support). He also defines the significance of various parameters in the context of technology management, such as technology security, technology classification, technology association, technology clustering, technology prediction or forecasting, innovation, adoption, diffusion, infusion and dominant design. The other objective of this session is to analyze the emerging concept of technology for humanity and to select a set of emerging technologies for the sustainability of human civilization.
Deep analytics is an intelligent, complex, hybrid, multi-phased and multi-dimensional data analysis system [Figure 1.1]. The basic steps of computation are data sourcing, data filtering / preprocessing, data ensembling, data analysis and knowledge discovery from data. The authorized data analysts select an optimal set of input variables, features and dimensions (e.g. scope, system, structure, security, strategy, staff-resources, skill-style-support), keeping them free from malicious attacks (e.g. false data injection, shilling); input data is sourced through authenticated channels accordingly. The sourced data is filtered, preprocessed (e.g. bagging, boosting, cross-validation) and ensembled. It is rational to adopt an optimal mix of quantitative methods (e.g. regression, prediction, sequence, association, classification and clustering algorithms) and qualitative methods (e.g. case-based reasoning, perception, process mapping, SWOT, CSF and value chain analysis) for multi-dimensional analysis. The analysts define intelligent training and testing strategies in terms of the selection of suitable soft computing tools, network architecture (number of layers and nodes), training algorithm, learning rate, number of training rounds, cross-validation and stopping criteria. Hidden knowledge is discovered from the data in terms of collective, collaborative, machine, security and business intelligence. The analysts audit the fairness and correctness of computation, as well as the reliability, consistency, rationality, transparency and accountability of the analytics.
Figure 1.1 : Deep Analytics
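The filtering/preprocessing, ensembling and cross-validation steps described above can be illustrated with a minimal sketch. The example below assumes scikit-learn in Python and a synthetic data set; the models, parameters and data are hypothetical placeholders rather than the actual deep analytics pipeline of the summit.

```python
# A minimal, hypothetical sketch of the filtering/preprocessing, ensembling and
# cross-validation steps of deep analytics, using scikit-learn on synthetic data.
from sklearn.datasets import make_classification
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.ensemble import BaggingClassifier, GradientBoostingClassifier
from sklearn.model_selection import cross_val_score

# Data sourcing (here: a synthetic stand-in for authenticated technical/business data)
X, y = make_classification(n_samples=500, n_features=7, random_state=0)

# Filtering / preprocessing followed by two ensembling strategies (bagging, boosting)
bagging = make_pipeline(StandardScaler(), BaggingClassifier(random_state=0))
boosting = make_pipeline(StandardScaler(), GradientBoostingClassifier(random_state=0))

# Cross-validation as a simple proxy for the training/testing strategy
for name, model in [("bagging", bagging), ("boosting", boosting)]:
    scores = cross_val_score(model, X, y, cv=5)
    print(f"{name}: mean accuracy {scores.mean():.3f} (+/- {scores.std():.3f})")
```

Bagging and boosting stand in for the ensembling step, while five-fold cross-validation illustrates the training and testing strategy mentioned above.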
Deep analytics can process precisely targeted, complex and fast queries on large (e.g. petabyte- and exabyte-scale) data sets of real-time and near-real-time systems. For example, deep learning is an advanced machine learning technique in which artificial neural networks (e.g. CNNs) can learn effectively from large amounts of data, much as the human brain learns from experience: by performing a task repeatedly and gradually improving the outcome of learning. Deep analytics follows a systematic, streamlined and structured process that can extract, organize and analyze large amounts of data in a form that is acceptable, useful and beneficial for an entity (e.g. an individual human agent, an organization or a BI information system). It is basically a specific type of distributed computing across a number of servers or nodes to speed up the analysis process. Generally, shallow analytics uses the concepts of means, standard deviations, variances, probabilities, proportions, pie charts, bar charts and tabs to analyze small data sets. Deep analytics analyzes large data sets based on the concepts of data visualization, descriptive and prescriptive statistics, predictive modeling, machine learning, multilevel modeling, data reduction, multivariate analysis, regression analysis, logistic regression analysis, text analysis and data wrangling. Deep analytics is often coupled with business intelligence applications which perform query-based searches on large data, analyze and extract information from data sets hosted on a complex and distributed architecture, and convert that information into specialized data visualization outcomes such as reports, charts and graphs. In this summit, deep analytics has been applied to the technology management system (TMS).
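The contrast between shallow and deep analytics can be made concrete with a small, hypothetical example: summary statistics on one hand and a predictive model on the other. The data, the column names (rd_spend, adoption) and the library choices (pandas, scikit-learn) are illustrative assumptions, not drawn from the summit's data.

```python
# Hypothetical contrast: shallow analytics (summary statistics) versus a
# deep-analytics step (predictive modeling) on the same small tabular data set.
import pandas as pd
from sklearn.linear_model import LinearRegression

df = pd.DataFrame({
    "rd_spend": [1.0, 2.5, 3.0, 4.2, 5.1, 6.0],  # hypothetical R&D spend
    "adoption": [10, 22, 27, 40, 48, 57],         # hypothetical adoption score
})

# Shallow analytics: means, standard deviations, quartiles
print(df.describe())

# Deeper analytics: a predictive model relating the two variables
model = LinearRegression().fit(df[["rd_spend"]], df["adoption"])
new_point = pd.DataFrame({"rd_spend": [7.0]})
print("predicted adoption at rd_spend=7.0:", model.predict(new_point)[0])
```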
Technological innovations are the practical implementation of creative, novel ideas into new products, services or processes. Innovations may be initiated in many forms and from various sources, such as firms, academic institutions, research laboratories, government and private enterprises, and individual agents. There are different types of innovations from the perspectives of scope, strengths, weaknesses, opportunities, threats and demands of the producers, service providers, users, service consumers and regulators.
The innovation funnel is a critical issue in technology management; the innovation process is often perceived as a funnel, with many potential ideas passing through its wide end but very few emerging from the development process as successful, profitable, economically and technically feasible products or services. Deep analytics is an intelligent method and consulting tool that is essential for the effective management of today's top technological innovations. It is basically an integrated framework built on a proper combination, or fit, of seven dimensions. Many technological innovation projects fail because project managers do not recognize the importance of this fit and tend to concentrate on only a few of these factors while ignoring the others. These seven factors must be integrated, coordinated and synchronized for the global diffusion of top technological innovations.
Deep Analytics Mechanism [DAM]
Agents: Single or a group of data analysts;
System: Technology Management System; /* Technology for humanity */
Objectives: Evaluate an emerging technology for innovation, adoption and diffusion;
Constraints: Availability of authenticated and correct data, time, effort, cost;
Input: Technical data (Dt), Business data (Db); /* Entity: an emerging technology for humanity */
Procedure: Source authenticated input data; filter, preprocess and ensemble the data; analyze the seven dimensions (S1-S7) with a mix of quantitative and qualitative methods; discover and audit the resulting intelligence;
Payment function: Compare a set of technologies based on cost-benefit analysis;
Output: Technology intelligence (collective, collaborative, security, machine, business);
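The DAM outline above can be read as a procedural skeleton. The sketch below is a hypothetical Python rendering; the Technology class, the dimension scores and the cost-benefit scoring are assumptions introduced only for illustration, not a prescribed implementation of the mechanism.

```python
# Hypothetical skeleton of the Deep Analytics Mechanism (DAM) described above.
# The seven 'S' dimensions, the scoring and the cost-benefit comparison are
# illustrative assumptions.
from dataclasses import dataclass, field

DIMENSIONS = ["scope", "system", "structure", "security",
              "strategy", "staff-resources", "skill-style-support"]

@dataclass
class Technology:
    name: str
    scores: dict = field(default_factory=dict)  # dimension -> score in [0, 1]
    cost: float = 0.0
    benefit: float = 0.0

def analyze(tech: Technology) -> float:
    """Data analysis step: aggregate the seven-dimension scores."""
    return sum(tech.scores.get(d, 0.0) for d in DIMENSIONS) / len(DIMENSIONS)

def payment_function(techs: list[Technology]) -> Technology:
    """Compare technologies on cost-benefit weighted by dimension fit."""
    return max(techs, key=lambda t: (t.benefit - t.cost) * analyze(t))

# Usage with made-up data for two emerging technologies
a = Technology("tech-A", {d: 0.8 for d in DIMENSIONS}, cost=2.0, benefit=5.0)
b = Technology("tech-B", {d: 0.6 for d in DIMENSIONS}, cost=1.0, benefit=3.5)
print("selected:", payment_function([a, b]).name)
```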
Deep analytics is essential for understanding the nature of a technological innovation and identifying the gaps between as-is and to-be capabilities in a systematic and compelling way. It reasons about seven dimensions under three major categories: (a) requirements engineering schema: scope [S1]; (b) technology schema: system [S2], structure [S3] and security [S4]; and (c) technology management schema: strategy [S5], staff-resources [S6] and skill-style-support [S7] [Figure 1.1]. This session analyzes each dimension briefly; the next sessions [2-10] reason about a set of cases of today's top technological innovations by applying the tool of deep analytics. The basic building blocks of our research methodology are critical reviews of existing works on technology management and case-based reasoning. We have reviewed various works on technology management and collected the data for the cases from technical papers and secondary sources. Session 10 concludes this summit.
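As a closing illustration, the grouping of the seven dimensions into the three schemas can be captured in a small data structure; the hypothetical mapping below simply mirrors the labels S1-S7 used in the text.

```python
# Hypothetical mapping of the seven 'S' dimensions [S1-S7] to the three schemas
# described above, introduced only to make the grouping explicit.
SEVEN_S_FRAMEWORK = {
    "requirements engineering schema": {"S1": "scope"},
    "technology schema": {"S2": "system", "S3": "structure", "S4": "security"},
    "technology management schema": {"S5": "strategy",
                                     "S6": "staff-resources",
                                     "S7": "skill-style-support"},
}

for schema, dims in SEVEN_S_FRAMEWORK.items():
    print(schema, "->", ", ".join(f"{k}: {v}" for k, v in dims.items()))
```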