Insights

By David Paschane 13 Nov, 2017
The ability to improve oneself is a high-value capability. 

When that capability is organized into the culture and structure of an organization, it becomes a powerful and sustainable benefit, one with multiple positive effects.  There are many related concepts in business management and the organizational sciences.  Our attached concept diagram is how we examine and explain the maturity of an organization's capability to improve itself.  The concept is based on our PASS discipline and architecture.  It is how we blend methodological rigor with technological advancement.

Aplin Labs Business Capability for Performance and Outcome Improvement
By David Paschane 01 Nov, 2017
There are many papers that claim to be useful frameworks for government reform.  It seems that every major consulting firm puts one out during major transitions, such as a new Federal Administration. 

Our paper is worth reading because it addresses three important themes: 

  • Data quality and access
  • Employee roles in change
  • Rigor of advanced analytics
We hope you read it and share it with your fellow government leaders: "Data-Powered Leadership Reform 2017"
By David Paschane 18 Aug, 2017
Architects - those who design buildings - are amazing people.   They combine imagination and math to anticipate possible futures.  Their work is often confused with strategic thinking, where linear plans are needed.  In contrast, architects consider the many-to-many-to-many possibilities throughout a design, and how, in combination, these will make something wonderful.

My grandfather was an architect.   He designed and oversaw the construction of many buildings in Southern California.  As a young kid, I recall walking with him, wearing a hardhat, and watching teams build Fashion Island in Newport Beach.  I don't know if he imagined that it would look like the image below.  His drawings were daytime pastels, light buildings, few cars, and happy strolling people.  I did not spend a lot of time looking at the stacks of blueprints, but I remember seeing incredible details in complex engineering diagrams.  

I was impressed.  Not as much with the thick layers of drawings, but with how he was thinking about the future capability he was helping create.  And, indeed, he was helping, as he spent a lot of time meeting with the building and engineering teams.  He fostered inputs from everyone.  He encouraged learning from every direction.
By David Paschane 17 Aug, 2017
If you need fraud detection, or fraud prevention, there are many options to use, and the recent emphasis is on Artificial Intelligence (AI).

Here, I summarize the AI perspective, and then propose methods that may be more feasible.

The AI promoters imply that a computer does all the thinking for us, thus it will not make errors, and is more thorough. AI tools operate on a simple concept for handling data: An algorithm, with many coded parameters, is used to make decisions about data that have many-to-many relationships; meanwhile, the program is scanning data sets to inform the algorithm so it can complete decisions with more insights.

AI creates the appearance of “smartness,” as it is faster and more comprehensive than people. However, the fraud detection algorithms and their parameters are functional designs based on the work of human analysts.  In fact, analysts may do it much better. 

Fraud detection and prevention is a human science problem, in that an analyst is interpreting statistical analyses of data patterns, given knowledge about how such data could be manipulated by others.  It is enriched by applying robust analytic disciplines, such as Performance Architectural Science Systems (PASS), which is a focus on the complex, layered factors that allow changes to happen, including fraud. 

The first objectives are to rationalize data, measure and test pattern variability, and verify the possible explanations, within the scope of possible human factors. Then, create parameters for algorithms, where these are used to repeat the baseline work of the analyst.

Once the algorithms prove their worth, by testing and optimizing the means of anticipating many possible fraud signals or risks, then they can be engineered into other tools, such as workflow platforms or financial transaction systems. The algorithms can be made more robust through recursive testing, but their foundation is developed in applied analyses.
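Recursive testing of this kind can be as simple as sweeping a rule's parameter over labeled historical cases and keeping the best-scoring value. The amounts, labels, and threshold range below are invented purely for illustration, not taken from any real fraud data:

```python
# Invented labeled history of transactions: (amount, known_fraud)
history = [(100, False), (105, False), (98, False), (400, True), (520, True)]

def accuracy(threshold):
    # Score a candidate rule that flags any amount above the threshold.
    return sum((amount > threshold) == fraud for amount, fraud in history) / len(history)

# Sweep candidate thresholds and keep the best-scoring one.
best = max(range(100, 600, 50), key=accuracy)
print(best, accuracy(best))  # → 150 1.0
```

In practice the "score" would be a richer measure than raw accuracy (false positives and false negatives rarely cost the same), but the loop structure is the same: test, compare, improve.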

While analytic work is advanced, it is not necessarily what is implied in AI.

The basic steps are:

  1. Rationalize data
  2. Test for variability patterns
  3. Verify explanations
  4. Develop data parameters
  5. Build algorithms
  6. Test algorithms on various data sources
  7. Improve algorithms
  8. Integrate algorithms into data handling
  9. Automate algorithm operations 
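As an illustration only, the first few steps (rationalize the data, measure its variability, and encode the analyst's parameters as a reusable rule) might look like this minimal Python sketch; the field name, baseline amounts, and z-score threshold are all assumptions, not part of the method itself:

```python
import statistics

def rationalize(records):
    # Step 1: drop records that are missing the fields needed for analysis.
    return [r["amount"] for r in records if r.get("amount") is not None]

def build_rule(baseline, threshold=3.0):
    # Steps 2-5: measure variability in verified-clean data, then encode
    # the analyst-derived parameters as a reusable rule (the "algorithm").
    mean = statistics.mean(baseline)
    stdev = statistics.pstdev(baseline) or 1.0
    return lambda amount: abs(amount - mean) / stdev > threshold

baseline = rationalize([{"amount": a} for a in (100, 102, 98, 101, 99, 103, 97, 100)])
is_suspicious = build_rule(baseline)
print(is_suspicious(101), is_suspicious(5000))  # → False True
```

The later steps then test this rule against other data sources, refine it, and wire it into the systems where the transactions actually flow.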
Feasible methods are ones that combine the rigor of statistical measurement with the robustness of human interpretation. The best fraud detection and prevention is developed over time by anticipating risks due to human actions.

The nine steps above outline a feasible process for combining and developing the most effective actions for addressing fraud.

Let's shine a bright light on those committing fraud.


By David Paschane 15 Jul, 2017

Government Leadership Needs Our Attention

"Among the 4 million people who operate, oversee, and audit federal operations, there are nearly 8,000 senior executives, who are the federal leaders responsible for the total cost and value of federal operations."

"Each leader has a common responsibility, regardless of their career rank, to (a) refine the operational capability, (b) improve the operational performance, and (c) optimize the operational outcomes."

The Times Are Changing

1. President Trump issued an Executive Order   [1] that required the Director of the Office of Management and Budget (OMB), Mick Mulvaney, to develop a plan for effective, efficient, and accountable operations throughout the federal government. 

2. Mr. Mulvaney issued an order  [2] to all federal agencies to develop plans for reforming their operational capability and performance, including the performance of their respective employees. 

3.  Outside of OMB, think tanks are publishing helpful, detailed government-wide reorganization plans recommending pragmatic cures to longstanding and pervasive problems in the federal government [3].

4.  President Trump issued a Memorandum  [4] that established the White House Office of American Innovation (OAI), and OAI is charged with developing policies and plans to improve federal operations and their outcomes. 

5. The Government Performance and Results Modernization Act (GPRMA)  [6] established leadership support from politically appointed executives, namely the offices of Deputy Secretaries.

6. Through President Trump’s next U.S. Chief Performance Officer  [7], federal leaders will have access to specialized support in resolving bureaucratization, adopting quality data, and reducing cross-operational inefficiencies.

7. The Digital Accountability and Transparency Act (DATA Act)  [8] required all federal spending information to be standardized and structured for open publication and bulk use, which enables the pursuit of operation-specific costs of business. 

8. The Data Coalition is pushing for the DATA Act’s full, source-level publication of agency-reported data in a format available for bulk download and analysis by third parties [70].

9. The U.S. Government Accountability Office is actively assessing the leadership support in GPRMA   [9] , and has an implementation status report due on the DATA Act this November 2017.  

10. Federal leaders have a business case for pursuing critical initiatives that support government reform.


By David Paschane 19 Apr, 2017
Technology worries people.   Specifically, the technology that changes work, such as Artificial Intelligence (AI), worries people. 

The real threat to one's value at work is not technology; it is bureaucratization.  Tools can enhance one's value, but stifling, calcifying, stove-piping bureaucracy can dehumanize every employee -- the cog-in-the-wheel problem.

What is often misunderstood about technology is that its greatest strength is in making quick analyses, especially complex ones with many potential inputs and outputs (e.g., algorithms).  When the analyses are performed and delivered through technology, we call it analytics. 

Ironically, very few organizations leverage analytics well.  They may have an analytic capability for a specific analysis, report, or routine question, but this is just the beginning of what is possible. 

Analytics can increase in both complexity and effectiveness.  

For example, descriptive analytics, such as how many patients were seen at a hospital last month, is low complexity and low effectiveness.   As the effectiveness increases, so does the complexity.  More advanced analytics include accounting measures, diagnostic statistics, predictive forecasting, and simulated prescriptions.  At each level, the need for information increases, and the mathematics in support of the analytics is more intense. 
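The jump from descriptive to predictive analytics can be seen in just a few lines. The hospital visit counts below are invented for illustration, and the "forecast" is the simplest possible trend extrapolation, a stand-in for the heavier mathematics the post describes:

```python
monthly_patients = [410, 425, 440, 455, 470]  # invented monthly counts

# Descriptive: how many patients were seen last month?
print("last month:", monthly_patients[-1])

# Predictive: extrapolate the average month-over-month change.
changes = [b - a for a, b in zip(monthly_patients, monthly_patients[1:])]
trend = sum(changes) / len(changes)
print("forecast:", monthly_patients[-1] + trend)  # → forecast: 485.0
```

Each rung up the ladder (diagnostic, predictive, prescriptive) demands more data and more mathematics than the one before it, exactly as described above.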

Even more effective are automated signals, automated decisions, and then artificial intelligence.  These all have similar effectiveness, in that they use embedded analytics to resolve questions as there is an indication of the need for an answer.   The question may not even be asked; it is simply required because of the data.   What is notable is that these three are increasingly complex, and as a result, expensive.   So, the most cost-effective analytics are automated signals and decisions.
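A minimal sketch of the difference between an automated signal and an automated decision, using an invented inventory-reorder example with made-up thresholds:

```python
def signal(inventory, reorder_point=20):
    # Automated signal: raise a flag; a person still makes the call.
    return inventory < reorder_point

def decision(inventory, reorder_point=20, order_size=100):
    # Automated decision: the system acts on the embedded analytic itself.
    return order_size if signal(inventory, reorder_point) else 0

print(signal(15), decision(15))  # → True 100
```

The decision layer is just the signal plus an action, which is why the two sit so close together in cost but deliver most of the effectiveness of far more complex AI.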

What this all means for the future of work is that analytic executives have an incentive to engineer analytic technologies that are full of advanced algorithms, so long as these support the work of specialized operators.

There is little incentive to adopt artificial intelligence.  Too complex, not much more effective. 

The well-fitted analytics are often called cognitive technologies.  The best are those that meet the needs of operators and executives.   This future of work may mean fewer layers of managers, thus less bureaucracy.

Aplin Labs focuses on these 5 goals in our cognitive technologies: 

  1.   Advanced algorithms to drive dynamic transactions
  2.   Assessments to pinpoint causes of desired outcomes
  3.   Automation and augmentation of routine tasks, analyses, and documents
  4.   Optimized user signals to drive rapid tasks and management functions
  5.   Anticipation of changes in conditional factors (e.g., legal compliance)
We see the future of work as a customization of these 5 goals. 
By David Paschane 19 Apr, 2017
There is plenty of buzz about cognitive analytics - what do you need to know?

The capability of a large organization is a combination of the value in its (a) teams of professional people and (b) analytically-driven tools.

If left to bureaucratic tendencies, the value of teams and tools diminishes.

Reestablishing and extending the value under changing conditions is difficult. It requires a plan for enhancing cognitive technologies – the analytically-driven tools that reinforce people’s work within operational teams.

In the pursuit of advanced analytics, cognitive technologies are the most valuable and cost-effective, before pursuing full artificial intelligence.

Cognitive technologies are used to optimize and manage an operation, as well as automate or augment routine tasks. They are also used organizationally to support the development of employees and the efficiency of their interactive work.

It's that simple.
By David Paschane 19 Apr, 2017
Our story started in 1994 in the Last Frontier, Anchorage, Alaska!

A few of us human scientists pondered the question of why so many organizations fail to resist the risks of bureaucratization. 

We were watching noble leaders, in business, government, and civics, try to achieve bold goals, but were often, if not always, overwhelmed by the calcification of structure. 

Our questions and concerns inspired studies into the cause-effects that are layered in organizations.  We established Aplin Labs as our means of exploring research and development that specializes in organizational behavior, and how it can be enhanced through cognitive technologies, with effective anticipation of conditional factors.

We concentrate on the design and development of three cognitive technologies:

  1.   Online organizational capability assessments - know the deep causes of performance  
  2.   Operational management analytic platforms - amplify the power of dedicated teams
  3.   Organizational learning integrated networks - reinforce employees' emerging value

To this day, we continue to refine these tools, as a means of influencing layered dynamic factors in the capability of large organizations.

Our work is based on our well-vetted discipline, Performance Architectural Science Systems (PASS) - a method of engineering capabilities, while accounting for the effects of changes within organizational conditions, or layered factors.

And, we drink a lot of coffee.
