Thursday 30 May 2024

Ai Driven = Ai Derived? Not for a Da Vinci Determinate

When the hard stop of intellectual property protection is cast by the wayside due to infinite "uniqueness", at what point does collective knowledge co-construction mean limitless innovation derivation? It may be close to 100%, thanks to Ai.

iGNITIATE: Ai Driven = Ai Derived? Not for a Da Vinci Determinate


Through an admittedly oversimplified timeline lens of, say, just the last 4000 years (plus or minus 2000 on the standard western calendar), when we examine previous generations of tool making, design, and thus engineering capability, we see that throughout the evolution and acquisition of artistic, mathematical, and similar skills, specific designs and engineering functionality were recognized and protected with a very high bar. To become a master took incredible time, patience, and resilience. To create something unique and usable took even longer. To have those efforts and outputs put into production, and so extend a user's natural human capabilities, took longer still.

Setting aside digital tool creation and usage, focusing only on physical object creation and production, and specifically within the context that Ai-driven learning and complex adaptive systems may in the future produce incredible jumps in the end functionality of physical goods, one of the biggest explosions of creativity is taking place right under our noses. Driven by a uniquely overlooked factor (the size of your mailing list / active social media following), it is the end producers (and not always the end users) who are driving what end users can enjoy and have in their lives. Think of something as simple and elegant as the Eames chair designs.

With the capability of the "master" mindset (and certainly the aspects of visual creativity) now a button push away, the idea that a Da Vinci mindset and the personal capability to produce specific, unique output are required is almost a side thought. As for the Da Vinci determinate, the timeline from creation to ownership (via a typical first created, prototyped, and manufactured model, and as always within the roughly four-year lab-to-real-world window), we are beginning to experience the technological capability of Ai to completely take over creativity as a value-based exponent in new product development efforts. What does this mean? That an infinite creativity capability means the idea of uniqueness may be all but over: small changes to any existing X mean the original (no matter how "unique") can be determined to be not similar in any way to the new production, and is thus indefensible.

How this becomes factored into the way "new" and "unique" outputs are delivered, especially in the context of physical products, could be as follows: medium-performing (complexity) groups, those more likely to involve and rely on online behaviors, may be substantially more productive at avoiding specific cases of similarity to pre-existing manufactured products, whereas low-performing (complexity) groups, who are likely to conduct design and new product development activities through verbal discourse and intense conversational effort, may no longer be rewarded for "uniqueness" not generated by Ai systems. Reliance on "activity" in the creative process (where generative Ai output is used to evaluate and increase "uniqueness"), regardless of exact uniqueness, becomes an issue.

Where specific product development lifetimes (of use) and value were previously well known (and in some cases lengthy), the time window for "newness" to leap over existing new product development efforts is shrinking at an incredible pace. Why? Because uniqueness (in many respects) can no longer be protected, and generative Ai becomes both the emancipator of, and the limiting factor on, ownership of breakthroughs.

 

###
#iGNITIATE #Design #DesignThinking #DesignInnovation #IndustrialDesign #iGNITEconvergence #iGNITEprogram #DesignLeadership #LawrenceLivermoreNationalLabs #NSF #USNavy #EcoleDesPonts #Topiade #LouisVuitton #WorldRetailCongress #REUTPALA #OM #Fujitsu #Sharing #Swarovski #321-Contact #Bausch&Lomb #M.ONDE #SunStar

 

 


Monday 29 April 2024

Does Innovation Theater + Innovation Choir = Innovation Compounding? Often.

For almost 25 years, a simple seminal study tied innovation (in its loosest definition) to the capacity to increase acceptance of what was considered "breakthrough" when future-backward, consistent group coordination was at play; the reality is that this is innovation compounding. And when it is executed as a group effort across multiple ecosystems, what could signal a dilution often becomes a further concentration of said breakthrough and a collapse of value chains, very closely mirroring the exact definition of innovation. Most of the time.

iGNITIATE: Does Innovation Theater + Innovation Choir = Innovation Compounding? Often.

In Competition, Innovation And Increasing Returns we see the underpinnings of a reality (echoed in many other past articles on similar topics): the idea of known, vertical consortia, where specific functional user groups share a collective need and an acceptance of risk across specific 3, 5, or 10+ year time frames, means category-defining transformational capabilities have a tendency to reach real-world applicability much sooner than expected. With Futures Design tools and foreseen futures that codify the barely defined into future functionality that is buildable (if not fully usable) today, the process feeds on itself if, and only if, end users see "the inevitable" on the way, e.g. connected computing, wireless communications, neural networks, non-von Neumann computing architectures, leading in short order to synthetic biological computing.

Where this has been referred to as Skunkworks Singularity efforts, it is the capability, and necessity, to broaden and codify the unknown into the shortly doable (even in its most rudimentary forms) that allows a choir of similar voices (which can easily be competitors) to push acceptable use into broader awareness. What becomes particularly interesting is how organizations past a certain size (internally, via the capability of intrapreneurship, and due to their reach across multiple, diverse industrial areas) become a choir of their own coordinated capability.

In the case of Amendola, Gaffard and Musso's findings, we see that this can take place not only inside one organization but sometimes across multiple organizations, even when incrementalism is lauded as breakthrough, but only when these innovations are (seemingly) consistent. Consistency itself becomes a form of innovation realization. What can be (and often is) ignored is how these "breakthroughs" are a function of acceptable (in relation to physics) doable output that can be dressed up as if a golden path were already laid. Examples can be seen in the evolution and adoption (in the long-tail sense of investigation and discovery) of neuromorphic computing (a completely different form from the current silicon-based von Neumann computing architectures) and, clearly, in the use of standard silicon-based computing architectures themselves. Where neuromorphic computing diverges from the standard paradigm, a new future, a wholly self-contained third self operating system (and the hardware to support it), is just one example of the compounding that is present and evolving as we speak.

 

###

 

 

Sunday 31 March 2024

Collaborative Cognition = Unblocking Uncertainty

When the absence of consensus becomes the limiting factor to hurdling innovation roadblocks, what other mechanisms ease collaborative cognition? Here are several for innovation plasticity.



In the race for interconnected innovation instances that continually lead to further convergence (divergences being a daily occurrence and part of the design, engineering, and innovation process), enabling alternative mechanisms for lab-based breakthroughs, and making it past the well-honed gauntlet of 'hold on, explain how that will work with / within my group' situations inside large-organization processes, means the key to success often resides not only in collaborative underpinnings but also in the unblocking of known or subconsciously accepted risks. This is often separate from, and unrelated to, enumerating the steps and changes necessary to make specific modules work, solving technically (and temporarily) "impossible" situations, and dealing with lab bench challenges. It is in the steps between lab bench science and manufacturable usability that these challenges ultimately rear their ugly heads.

Where we see this particularly well articulated is in how Systemic Innovation Designers Through Informal and Collaborative Activities drive formal and specific processes that allow for the quantization of ever-changing user attitudes, often reflected as 'needs' in certain circumstances, sometimes early in the design process. This is embodied in the further process of transversal competencies mediated by digital tools (telepresence, simultaneous collaboration, and synchronous and asynchronous communication) which ultimately, if done properly, lead to convergences through a systematic peristalsis when pushed, or, more aptly in today's language, "enabled", through effort.

This has been echoed for more than 20 years (ten preceding the original publication and ten after), where in Facilitating Innovation Through Cognitive Mapping of Uncertainty we see the systematic need for specific Skunkworks frameworks that can encompass the incredible level of uncertainty in the early stages of carrying breakthrough efforts (from lab bench science) through the engineering process. This mixing, cognitive separation, and refinement leads to the formation of what-to-do / what-not-to-do weavings, underpinning what some have referred to as likeness lillypads: further connections between what-has-worked, may-work, cannot-work, and will-not-work environments. Moreover, we see how a seemingly impermeable footpath to alternative future scenario directions can, and often does, allow existing and transitionary systems to take hold and forge those paths, something some refer to as innovation plasticity.

With the breaking of barriers (both conscious and unconscious), not only in end users but in those involved in delivering specific components and end-user products, as the key factor in fostering the highest levels of convergence capability during and after lab bench science validation, it seems many keys are needed to increase the unblocking of uncertainty. That means fostering, at every level of the innovation effort, a constant and clear mode of collaboration as well as a willingness to bend the rules: yet another type of innovation definition.

 

###

 

Thursday 29 February 2024

When Consensus is Conceded Innovation Isn't Interrupted

What is the biggest hurdle for innovation to overcome? Consistent convergence. But how, in the face of constant concern, even when it is concern over what has already been done, and when it is still said it can't be done? Here's how.

iGNITIATE - When Consensus is Conceded Innovation Isn't Interrupted

Design, Design Thinking, Design Futures, Detailed Design, and Design Engineering together form the basis for, and the most flexible tools of, constraint-based morphing of input parameters into working end solutions. In other, arguably "unhelpful" ways, we separately see the advent of Ai capabilities (especially as related to non-physics-based output) that can turn just about anything, image- and video-wise, into anything else and connect them together. It is here that the term "hallucination" for given data output from search engines / search engine data comes into being. It is also here that the output of these Ai systems demonstrates that the fundamental vocabulary (specifically, the multitude of search engine URL results produced) can easily become obscured. Where in the past we had a roughly well documented history of where data "results" came from, the tables have now completely turned. But how? And how can we use this to further innovation activities?

In the past, the idea of innovation (again, from the root definition of the word enumerated by Schumpeter) ran as follows: invention (conceiving a new idea or process that works on a lab bench, as governed by the laws of physics) turns into innovation (arranging the manufacturing and producibility requirements for implementing an invention; here Schumpeter seems simply to have redefined entrepreneurship), which turns into diffusion (whereby people observing the new discovery adopt it [purchase, via entrepreneurship] or imitate it [copy it to sell as their own]; here Schumpeter has it correct). That sequence now seems upended. Ai, and specifically generative Ai in its many forms, lets anyone jump straight from invention to an infinite number of diffusions (and call them all new), at the expense of perpetuating concern over adoption, which is in fact the exact limiter on innovation discussed above. That which is not known for long enough to be encoded with the guise of safety is, as the saying goes, quite often thrown out with the bathwater, and that is where innovation fails.
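The three-stage sequence above can be sketched as a minimal state machine. This is purely illustrative (the names and the `generative_ai` flag are this sketch's own, not any formalism of Schumpeter's); it only encodes the paragraph's claim that generative Ai allows a leap from invention straight to diffusion:

```python
from enum import Enum, auto

class Stage(Enum):
    """Schumpeter's three stages, as described above."""
    INVENTION = auto()   # a new idea or process that works on the lab bench
    INNOVATION = auto()  # arranging manufacturing / producibility
    DIFFUSION = auto()   # observers adopt (purchase) or imitate (copy)

def next_stage(stage: Stage, generative_ai: bool = False) -> Stage:
    # Classically each stage gates the next; the claim above is that
    # generative Ai lets work leap from invention straight to diffusion,
    # skipping the producibility gate entirely.
    if stage is Stage.INVENTION:
        return Stage.DIFFUSION if generative_ai else Stage.INNOVATION
    return Stage.DIFFUSION

print(next_stage(Stage.INVENTION))                      # classic path
print(next_stage(Stage.INVENTION, generative_ai=True))  # the Ai shortcut
```

The point of the toy model: once the middle gate is optional, nothing structural separates one diffusion from infinitely many.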

When time frames for diffusion (in the above sense) are extended and slowed, and when users cannot easily discern viability and usability, it is here that the limiting factor on uptake takes effect and further barriers are created, sitting at the constant forefront of innovation efforts. It is here that we must diverge from Schumpeter's definition of "innovation" as the entrepreneurial process; a more accurate form of the definition comes into play: the effort of diluting and dispersing concern so as to actually reach consistent consensus, thus increasing the capability for the value in some new thing to take hold. A simple example is the Xerox / xerography photocopying system, begun several years before the technology was finally made to "work" in 1938, completing the lab science / bench science invention phase. It was not until 1944 that it was "noticed" as important, 1948 before it was deemed a "successful" development effort (most likely because the innovation phase, making it work in a large-scale manufacturable manner, was complete), and 1950 before it was available for sale: possibly 15 years after its conception, design, and initial "working" mode. Here we see the effort of consistent consensus even after the math and physics were validated. Was the issue consensus pressure leading groups to reject this useful (and thus patentable) idea rather than move forward with it? In many cases, absolutely, as detailed in Greater Variability In Judgements Of The Value Of Novel Ideas in Nature. How then can these situations be mitigated so as to close such gaps in a better way? Interestingly, it is as simple as unwavering concession.

Design, and thus its constraint-based counterpart, concession, is inherently limited (as in the case of raw physics) by the fact that some configurations of materials cannot operate in the way initially conceived: "there are some things you just can't make plastic do," as Steve Jobs famously said. However, when it is possible to keep 5000 things in play at the same time and still reach functionality that increases usability within the context of efficient manufacturability, we see second-order breakthroughs in usability, and the elevation of concern, happen. And it is this relentless process of configuring and re-configuring constraints that ultimately allows innovations to persist in the face of "concern", which is often veiled within individual confidences of need that may not be (and often are not) part of the actual evolution of the innovation to be addressed and delivered upon. Regardless, these hurdles cannot be ignored, and it is Design, Design Thinking, Design Futures, Detailed Design, and Design Engineering that bring about the quickest convergence.

 

###

 

Wednesday 31 January 2024

MemRistors + MemGTP + MemOS Might Mean MemInnovation

With what may be, mathematically, the most radical technological upheaval in adoption that humanity has seen, Ai is poised to make MemInnovation happen shortly. Can such a thing be said? Yes.

iGNITIATE: MemRistors + MemGTP + MemOS Might Mean MemInnovation

Via the creation and culmination of years of research into non-silicon and semi-silicon chip fabrication designed for brain-like, neuronal processing technology, as detailed in Dynamical Memristors For Higher-Complexity Neuromorphic Computing, we see that fully personalized Ai with long-term memory (thus the 3rd self), self-editing memory, infinite context windows, access to unlimited data, and customizable tools is already here, now even in some software-only systems. With the stitching together of technologies such as MemRistors / neuromorphic computing and interfaces such as MemGTP, a new fabric of systems, what is being called MemOS and thus gtpOS, or, as referenced in the past, an ownable version of The Third Self, has come into usability in its early forms. Think of the first version of Douglas Engelbart's "The Mother of All Demos", or Xerox PARC's Alto GUI that made its way into the creation of the first Mac computers by Apple.

With the specific ability of MemRistors + MemGTP + MemOS to move design and innovation practitioners into a capability that can be considered MemInnovation, we see further how participatory innovation and generative design systems can directly affect the way advanced R&D and design efforts evolve. More specifically, when non-linear and novel technological and design-centric integration takes place, Skunkworks models of R&D, innovation, and design efforts can evolve at a pace faster than the technology produced, all due to the design-oriented aspect of the technology. In this sense, systems and technologies such as MemRistors + MemGTP + MemOS create integrations, platforms, that behave like a second derivative: just as acceleration is the rate of change of velocity, here the uptake of such technologies is the velocity, and these platforms accelerate it.
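The velocity/acceleration analogy can be made concrete with a toy calculation. The cumulative-uptake curve below is invented purely for illustration (it is not real adoption data); NumPy's `gradient` supplies the numerical derivatives:

```python
import numpy as np

# Invented cumulative-uptake curve over 10 periods (not real data).
t = np.arange(10.0)
uptake = 3.0 * t**2                      # position-like: total adopters

velocity = np.gradient(uptake, t)        # 1st derivative: adoption rate
acceleration = np.gradient(velocity, t)  # 2nd derivative: change in that rate

# Away from the endpoints, the numerical second derivative of 3t^2
# comes out as the constant 6: steady compounding of uptake.
print(acceleration[2:-2])
```

For genuinely accelerating adoption the second derivative stays positive; a platform that "performs the second derivative" in the paragraph's sense is one that pushes that number up.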

Further, in Disruptive Science Has Declined – Even As Papers Proliferate in Nature, with an incredible sample size of 45 million manuscripts along with data from 3.9 million patents, we see that converging technologies such as MemRistors + MemGTP + MemOS are, even more rapidly than before, pushing out the long-held notion (and acceptance) of what counts as disruption and thus true innovation. Where this becomes particularly interesting is the way firms invest in such patents almost to create an unbreakable (if patented fast enough) barrier to creativity, inadvertently discouraging radical design and innovation efforts unless the output's value to the larger owned system takes hold, which is of incredible value to the firms that focus on the creation and use of such efforts. Thus the idea of "innovation is dead, long live innovation" is fully rooted in the development of neuromorphic computing / MemRistors + MemGTP + MemOS, and it is here / right around the corner.

 

###

 

 

---