When to Give Employees Access to Data and Analytics
https://smallbiz.com/when-to-give-employees-access-to-data-and-analytics/
Wed, 24 May 2023 12:25:15 +0000

As business leaders strive to get the most out of their analytics investments, democratized data science often appears to offer the perfect solution. Using analytics software with no-code and low-code tools can put data science techniques into virtually anyone’s hands. In the best scenarios, this leads to better decision making and greater self-reliance and self-service in data analysis — particularly as demand for data scientists far outstrips their supply. Add to that reduced talent costs (with fewer high-cost data scientists) and more scalable customization to tailor analysis to a particular business need and context.

However, amid all the discussion around whether and how to democratize data science and analytics, a crucial point has been overlooked. The conversation needs to define when to democratize data and analytics, even to the point of redefining what democratization should mean.

Fully democratized data science and analytics present many risks. As Reid Blackman and Tamara Sipes wrote in a recent article, data science is difficult, and an untrained “expert” cannot necessarily solve hard problems, even with good software. The ease of clicking a button that produces results provides no assurance that the answer is good — in fact, it could be deeply flawed, and only a trained data scientist would know.

It’s Only a Matter of Time

Even with these reservations, however, democratization of data science is here to stay, as evidenced by the proliferation of software and analytics tools. Thomas Redman and Thomas Davenport are among those who advocate for developing “citizen data scientists,” even recommending that candidates for every position be screened for basic data science skills and aptitudes.

Democratization of data science, however, should not be taken to the extreme. Analytics need not be at everyone’s fingertips for an organization to flourish. How many outrageously talented people would go unhired simply because they lack “basic data science skills”? Such a requirement is unrealistic and overly limiting.

As business leaders look to democratize data and analysis within their organizations, the real question they should be asking is “when” it makes the most sense. This starts by acknowledging that not every “citizen” in an organization has the skills to be a citizen data scientist. As Nick Elprin, CEO and co-founder of Domino Data Labs, which provides data science and machine learning tools to organizations, told me in a recent conversation, “As soon as you get into modeling, more complicated statistical issues are often lurking under the surface.”

The Challenge of Data Democratization

Consider a grocery chain that recently used advanced predictive methods to right-size its demand planning, in an attempt to avoid having too much inventory (resulting in spoilage) or too little (resulting in lost sales). The losses due to spoilage and stockouts were not enormous, but the problem of curtailing them was very hard to solve — given all the variables of demand, seasonality, and consumer behaviors. The complexity of the problem meant that the grocery chain could not leave it to citizen data scientists to figure out; instead, it had to leverage a team of bona fide, well-trained data scientists.

Data citizenry requires a “representative democracy,” as Elprin and I discussed. Just as U.S. citizens elect politicians to represent them in Congress (presumably to act in their best interests in legislative matters), so too organizations need the right representation by data scientists and analysts to weigh in on issues that others simply don’t have the expertise to address.

In short, it’s knowing when and to what degree to democratize data. I suggest the following five criteria:

Think about the “citizen’s” skill level: The citizen data scientist, in some shape and form, is here to stay. As stated earlier, there simply aren’t enough data scientists to go around, and using this scarce talent to address every data issue isn’t sustainable. More to the point, democratization of data is key to inculcating analytical thinking across the organization. A well-recognized example is Coca-Cola, which has rolled out a digital academy to train managers and team leaders, producing graduates of the program who are credited with about 20 digital, automation, and analytics initiatives at several sites in the company’s manufacturing operations.

However, when it comes to engaging in predictive modeling and advanced data analysis that could fundamentally change a company’s operations, it’s crucial to consider the skill level of the “citizen.” A sophisticated tool in the hands of a data scientist is additive and valuable; the same tool in the hands of someone who is merely “playing around in data” can lead to errors, incorrect assumptions, questionable results, and misinterpretation of outcomes and conclusions.

Measure the importance of the problem: The more important a problem is to the company, the more imperative it is to have an expert handling the data analysis. For example, generating a simple graphic of historical purchasing trends can probably be accomplished by someone with a dashboard that displays data in a visually appealing form. But a strategic decision that has meaningful impact on a company’s operations requires expertise and reliable accuracy. For example, how much an insurance company should charge for a policy is so deeply foundational to the business model itself that it would be unwise to relegate this task to a non-expert.

Determine the problem’s complexity: Solving complex problems is beyond the capacity of the typical citizen data scientist. Consider the difference between comparing customer satisfaction scores across customer segments (simple, well-defined metrics and lower-risk) versus using deep learning to detect cancer in a patient (complex and high-risk). Such complexity cannot be left to a non-expert making cavalier decisions — and potentially the wrong decisions. When complexity and stakes are low, democratizing data makes sense.

An example is a Fortune 500 company I work with, which runs on data throughout its operations. A few years ago, I ran a training program in which more than 4,500 managers were divided into roughly 1,000 small teams, each of which was asked to articulate an important business problem that could be solved with analytics. Teams were empowered to solve simple problems with available software tools, but most problems surfaced precisely because they were difficult to solve. Importantly, these managers were not charged with actually solving those difficult problems, but rather with collaborating with the data science team. Notably, the teams identified no fewer than 1,000 business opportunities where analytics could help the organization.

Empower those with domain expertise: If a company is seeking some “directional” insights — customer X is more likely to buy a product than customer Y — then democratization of data and some lower-level citizen data science will probably suffice. In fact, tackling these types of lower-level analyses can be a great way to empower those with domain expertise (i.e., being closest to the customers) with some simplified data tools. Greater precision (such as with high-stakes and complex issues) requires expertise.

The most compelling case for precision is when there are high-stakes decisions to be made based on some threshold. If an aggressive cancer treatment plan with significant side effects were to be undertaken at, for instance, greater than 30% likelihood of cancer, it would be important to differentiate between 29.9% and 30.1%. Precision matters — especially in medicine, clinical operations, technical operations, and for financial institutions that navigate markets and risk, often to capture very small margins at scale.
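To make the threshold point concrete, here is a minimal sketch, with a hypothetical 30% cutoff and fabricated likelihood values rather than real clinical logic, of how two nearly identical estimates can trigger opposite decisions:

```python
# Hypothetical decision threshold for an aggressive treatment plan.
TREATMENT_THRESHOLD = 0.30

def recommend_treatment(predicted_likelihood: float) -> bool:
    """Return True when the predicted likelihood crosses the threshold."""
    return predicted_likelihood > TREATMENT_THRESHOLD

# Two estimates that differ by only 0.2 percentage points land on
# opposite sides of the cutoff.
print(recommend_treatment(0.299))  # False
print(recommend_treatment(0.301))  # True
```

A model whose error is larger than that 0.2-point gap cannot reliably distinguish the two cases, which is why precision work at the threshold belongs with experts.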

Challenge experts to scout for bias: Advanced analytics and AI can easily lead to decisions that are considered “biased.” This is challenging in part because the point of analytics is to discriminate — that is, to base choices and decisions on certain variables. (Send this offer to this older male, but not to this younger female, because we think they will exhibit different purchasing behaviors in response.) The big question, therefore, is when such discrimination is actually acceptable and even good — and when it is inherently problematic, unfair, and dangerous to a company’s reputation.

Consider the example of Goldman Sachs, which was accused of discriminating by offering less credit on an Apple credit card to women than to men. In response, Goldman Sachs said it did not use gender in its model, only factors such as credit history and income. However, one could argue that credit history and income are correlated to gender and using those variables punishes women who tend to make less money on average and historically have had less opportunity to build credit. When using output that discriminates, decision-makers and data professionals alike need to understand how the data were generated and the interconnectedness of the data, as well as how to measure such things as differential treatment and much more. A company should never put its reputation on the line by having a citizen data scientist alone determine whether a model is biased.

Democratizing data has its merits, but it comes with challenges. Giving the keys to everyone doesn’t make them an expert, and gathering the wrong insights can be catastrophic. New software tools can allow everyone to use data, but don’t mistake that widespread access for genuine expertise.

Generative AI Will Change Your Business. Here’s How to Adapt.
https://smallbiz.com/generative-ai-will-change-your-business-heres-how-to-adapt/
Wed, 12 Apr 2023 12:25:47 +0000

It’s coming. Generative AI will change the nature of how we interact with all software, and because so many brands interact with their customers through significant software components, generative AI will drive and distinguish how more brands compete.

In our last HBR piece, “Customer Experience in the Age of AI,” we discussed how the use of one’s customer information is already differentiating branded experiences. Now with generative AI, personalization will go even further, tailoring all aspects of digital interaction to how the customer wants it to flow, not how product designers envision cramming in more menus and features. And as the software follows the customer, it will go places beyond the tight boundaries of a brand’s product. It will need to offer solutions to the things the customer wants to do: solve the full package of what someone needs, and help them through their full journey to get there, even if that means linking to outside partners, rethinking the definition of one’s offerings, and developing the underlying data and tech architecture to connect everything involved in the solution.

Generative AI can “generate” text, speech, images, music, video, and especially code. When that capability is joined with a feed of someone’s own information, used to tailor the when, what, and how of an interaction, then the ease with which someone can get things done, and the broadening accessibility of software, goes up dramatically. The simple input question box that stands at the center of Google and now of most generative AI systems, such as ChatGPT and DALL-E 2, will power more systems. Say goodbye to drop-down menus in software, and the inherently guided restrictions they place on how you use them. Instead, you’ll just see: “What do you want to do today?” And when you tell it what you want to do, it will likely offer some suggestions, drawing upon its knowledge of what you did last time, what triggers the system knows about your current context, and what you’ve already stored in the system as your core goals, such as “save for a trip,” “remodel our kitchen,” or “manage meal plans for my family of five with special dietary needs.”

Without the boundaries of a conventional software interface, consumers will just want to get done what they need, not caring whether the brand behind the software has limitations. The change in how we interact, and what we expect, will be dramatic, and dramatically more democratizing.

So much of the hype around generative AI has focused on its ability to generate text, images, and sounds, but it also can create code to automate actions and to facilitate pulling in external and internal data. By generating code in response to a command, it creates a shortcut that takes a user straight from a command to an action that simply gets done. No more working through all of the menus in the software. Even queries and analyses of the data stored in an application will be easily done just by asking: “Who are the contacts I have not called in the last 90 days?” or “When is the next time I am scheduled to be in NYC with an opening for dinner?” To answer these questions today, we have to go into an application and gather data (possibly manually), sometimes from outside of the application itself. With generative AI, the query can be recognized, code created, possibilities ranked, and the best answer generated. In milliseconds.
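The flow described above (query recognized, code generated, answer returned) can be sketched as follows. The contacts schema, the hardcoded reference date, and the toy `generate_sql` stand-in for a real generative model are all assumptions for illustration:

```python
# Illustrative only: a toy natural-language-to-SQL flow over a small
# in-memory database. A real system would call a generative model
# instead of the hardcoded generate_sql() below.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE contacts (name TEXT, last_called TEXT)")
conn.executemany(
    "INSERT INTO contacts VALUES (?, ?)",
    [("Ana", "2023-01-05"), ("Ben", "2023-05-20"), ("Cy", "2022-11-30")],
)

def generate_sql(question: str) -> str:
    """Stand-in for a generative model that emits code from a prompt."""
    if "not called" in question and "90 days" in question:
        # Reference date fixed for reproducibility of the sketch.
        return (
            "SELECT name FROM contacts "
            "WHERE last_called < date('2023-06-01', '-90 days')"
        )
    raise NotImplementedError("unrecognized question")

query = generate_sql("Who are the contacts I have not called in the last 90 days?")
stale = sorted(row[0] for row in conn.execute(query))
print(stale)  # ['Ana', 'Cy']
```

The user never sees the SQL; they ask the question, the code is generated and executed, and only the answer comes back.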

This drastically simplifies how we interact with what we think of as today’s applications. It also enables more brands to build applications as part of their value proposition. “Given the weather, traffic, and who I am with, give me a tourist itinerary for the afternoon, with an ongoing guide, and the ability to just buy any tickets in advance to skip any lines.” “Here’s my budget, here’s five pictures of my current bathroom, here’s what I want from it, now give me a renovation design, a complete plan for doing it, and the ability to put it out for bid.” Who will create these capabilities? Powerful tech companies? Brands who already have relationships in their relevant categories? New, focused disruptors? The game is just starting, but the needed capabilities and business philosophies are already taking shape.

A Broader Journey with Broader Boundaries

In a world where generative AI and all of the other evolving AI systems proliferate, building one’s own offering requires focusing on the broadest possible view of one’s pool of data, of the journeys you can enable, and of the risks they raise:

Bring data together.

Solving for a customer’s complete need will require pulling from information across your company, and likely beyond your boundaries. One of the biggest challenges for most applications, and actually for most IT departments, is bringing data together from disparate systems. Many AI systems can write the code needed to understand the schemas of two different databases and integrate them into one repository, which can save several steps in standardizing data schemas. AI teams still need to dedicate time to data cleansing and data governance (arguably even more so), for example, aligning on the right definitions of key data features. However, with AI capabilities in hand, the next steps in the process of bringing all the data together become easier.
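A minimal sketch of that integration step, in plain Python with illustrative field names: two systems hold overlapping customer records under different schemas, and a small key mapping (the piece AI-generated code can help supply) aligns them into one repository.

```python
# Illustrative records from two hypothetical systems with different schemas.
crm_rows = [
    {"cust_id": 1, "full_name": "Ana Ruiz"},
    {"cust_id": 2, "full_name": "Ben Ode"},
]
billing_rows = [
    {"customer_no": 1, "monthly_spend": 120.0},
    {"customer_no": 2, "monthly_spend": 80.5},
]

# The schema mapping is the part an AI assistant can help generate.
KEY_MAP = {"cust_id": "customer_id", "customer_no": "customer_id"}

def normalize(row: dict) -> dict:
    """Rename each system's fields to one shared convention."""
    return {KEY_MAP.get(k, k): v for k, v in row.items()}

# Merge normalized rows into a single repository keyed by customer_id.
merged: dict[int, dict] = {}
for row in map(normalize, crm_rows + billing_rows):
    merged.setdefault(row["customer_id"], {}).update(row)

print(list(merged.values()))
```

Generating the mapping is the easy part; deciding that `cust_id` and `customer_no` really mean the same thing is the data-governance work that still needs humans.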

Narrative AI, for example, offers a marketplace for buying and selling data, along with data collaboration software that allows companies to import data from anywhere into their own repositories, aligned to their schema, with merely a click. Data from across a company, from partners, or from sellers of data, can be integrated and then used for modeling in a flash.

Combining one’s own proprietary data with public data, data from other available AI tools, and from many external parties can serve to dramatically improve the AI’s ability to understand one’s context, predict what is being asked, and have a broader pool from which to execute a command.

The old rule around “garbage in, garbage out” still applies, however. Especially when it comes to integrating third-party data, it is important to cross-check the accuracy with internal data before integrating it into the underlying data set. For example, one fashion brand recently found that gender data purchased from a third-party source didn’t match its internal data 50% of the time, so the source and reliability really matters.

The “rules layer” becomes even more critical.

Without obvious restrictions on what a customer can ask for in an input box, the AI needs guidelines that ensure it responds appropriately to requests that are beyond its means or that are inappropriate. This amplifies the need for a sharp focus on the rules layer, where experience designers, marketers, and business decision makers set the target parameters for the AI to optimize.

For example, for an airline brand that leveraged AI to decide on the “next best conversation” to engage in with customers, we set rules around what products could be marketed to which customers, what copy could be used in which jurisdictions, and rules around anti-repetition to ensure customers didn’t get bombarded with irrelevant messages.
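A rules layer like the one described can be sketched as a plain filter applied before any AI-ranked message is sent. The rule set, field names, and 14-day cooldown below are hypothetical, not the airline’s actual parameters:

```python
# Hypothetical rules layer: business-defined constraints gate candidate
# messages before the AI's choice is acted on.
from datetime import date, timedelta

def passes_rules(message: dict, customer: dict, today: date) -> bool:
    # Eligibility: only market products this customer can actually buy.
    if message["product"] not in customer["eligible_products"]:
        return False
    # Jurisdiction: some copy may only run in approved regions.
    if customer["region"] not in message["approved_regions"]:
        return False
    # Anti-repetition: don't resend within a cooldown window.
    last = customer["last_contacted"].get(message["product"])
    if last is not None and today - last < timedelta(days=14):
        return False
    return True

customer = {
    "eligible_products": {"lounge_pass", "seat_upgrade"},
    "region": "EU",
    "last_contacted": {"seat_upgrade": date(2023, 5, 20)},
}
message = {"product": "seat_upgrade", "approved_regions": {"EU", "UK"}}

# Blocked: the customer heard about this product five days ago.
print(passes_rules(message, customer, today=date(2023, 5, 25)))  # False
```

Keeping these rules in explicit, human-authored code, rather than inside the model, is what lets business decision makers tighten or relax them without retraining anything.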

These constraints become even more critical in the era of generative AI. As pioneers of these solutions are finding, customers will be quick to point out when the machine “breaks” and produces nonsensical answers. The best approaches will therefore start small and be tailored to specific solutions where the rules can be tightly defined and where human decision makers can design rules for edge cases.

Deliver the end to end journey, and the specific use cases involved.

Customers will just ask for what they need, and will seek the simplest and/or most cost-effective way to get it done. What is the true end goal of the customer? How far can you get? With the ability to move information more easily across parties, you can build partnerships for data and for execution of the actions that help a customer through their journey; your ecosystem of business relationships will therefore differentiate your brand.

In his impressive demo of how HubSpot is incorporating generative AI into “ChatSpot,” Dharmesh Shah, CTO and founder of HubSpot, lays out how the company is mingling the capabilities of HubSpot with OpenAI and with other tools. Not only does he show HubSpot’s interface reduced to just a single text-entry prompt, but he also shows new capabilities that extend well beyond HubSpot’s current borders. A salesperson seeking to send an email to a business leader at a target company can use ChatSpot to perform research on the company and on the target business leader, and then draft an email that incorporates both information from the research and what the system knows about the salesperson. The resulting email draft can then be edited, sent, and tracked by HubSpot’s system, and the target business leader automatically entered into a contact database with all associated information.

The power of connected information, automatic code creation, and generated output is leading many other companies to extend their borders, not as conventional “vertical,” or “horizontal” expansion, but as “journey expansion.” When you can offer “services” based on a simple user command, those commands will reflect the customer’s true goal and the total solution they seek, not just a small component that you may have been dealing with before.

Differentiate via your ecosystem.

Solving for those broader needs inevitably will pull you into new kinds of partner relationships. As you build out your end-to-end journey capabilities, how you construct those business relationships will become a critical new basis for strategy. How trustworthy, how well permissioned, how timely, how comprehensive, and how biased is their data? How will they use the data your brand sends out? What is the basis of your relationship, quality control, and data integration? Pre-negotiated privileged partnerships? A simple vendor relationship? How are you charging for the broader service, and how will the parties involved get their cut?

Just as search brands like Google, ecommerce marketplaces like Amazon, and recommendation engines such as Trip Advisor became gateways for sellers, more brands can become front-end navigators for a customer journey if they can offer quality partners, experience personalization, and simplicity. CVS could become a full health-network coordinator that health providers, health tech, wellness services, pharma, and other support services plug into. When its app can let you simply ask “How can you help me lose 30 pounds?” or “How can you help me deal with my increasing arthritis?”, the end-to-end program it can generate and then completely manage, through prompts to you and information passed around its network, will be a critical differentiator in how it, as a brand, builds loyalty, captures your data, and uses that data to keep increasing service quality.

Prioritize safety, fairness, privacy, security, and transparency.

The way you manage data becomes part of your brand, and the outcomes for your customers will have edge cases and bias risks that you should seek out and mitigate. We are all reading stories of how people are pushing generative AI systems, such as ChatGPT, to extremes, and getting back what the applications’ developers call “hallucinations,” or bizarre responses. We are also seeing responses that come back as solid assertions of wrong facts, or responses derived from biased bases of data that can lead to dangerous outcomes for some populations. Companies are also getting “outed” for sharing private customer information with other parties, without customer permission, and clearly not for the benefit of their customers.

The risks — from the core data, to the management of data, to the nature of the output of the generative AI — will simply keep multiplying. Some companies, such as American Express, have created new positions for chief customer protection officers, whose role is to stay ahead of potential risk scenarios, but more importantly, to build safeguards into how product managers are developing and managing the systems. Risk committees on corporate boards are already bringing in new experts and expanding their purviews, but more action has to happen pre-emptively. Testing data pools for bias, understanding where data came from and its copyright/accuracy/privacy risks, managing explicit customer permissions, limiting where information can go, and constantly testing the application for edge cases where customers could push it to extremes, are all critical processes to build into one’s core product management discipline, and into the questions that top management routinely has to ask. Boards will expect to see dashboards on these kinds of activities, and other external watchdogs, including lawyers representing legal challenges, will demand them as well.

Is it worth it? The risks will constantly multiply, and the costs of creating structures to manage those risks will be real. We’ve only begun to figure out how to manage bias, accuracy, copyright, privacy, and manipulated ranking risks at scale. The opacity of the systems often makes it impossible to explain how an outcome happened if some kind of audit is necessary.

But nonetheless, the capabilities of generative AI are not only available, they are the fastest-growing class of applications ever. The accuracy will improve as the pool of tapped data increases, and as parallel AI systems as well as “humans in the loop” work to find and remedy those nasty “hallucinations.”

The potential for simplicity, personalization, and democratization of access to new and existing applications will pull in not only hundreds of start-ups, but will also tempt many established brands into creating new AI-forward offerings. If they can do more than just amuse, and actually take a customer through more of the requirements of their journey than ever before, and do so in a way that inspires trust, brands could open up new sources of revenue from the services they can enable beyond their currently narrow borders. For the right use cases, speed and personalization could possibly be worth a price premium. But more likely, the automation abilities of AI will pull costs out of the overall system and put pressure on all participants to manage efficiently, and compete accordingly.

We are now opening up a new dialogue between brands and their customers. Literally. Not like the esoteric descriptions of what happened in the earlier days of digital interaction. Now we are talking back and forth. Getting things done. Together. Simply. In a trustworthy fashion. Just how the customer wants it. The race is on to see which brands can deliver.
