Our AI journey in 2023

Andrew Butel - November 2, 2023

ChatGPT has shown us that Generative AI is an awesome personal productivity tool, but how do we leverage it for our clients, and how do we achieve commercial gains across the whole company?

These are questions we started asking earlier in the year. But there were a couple of hurdles we had to clear before we could really start to explore the benefits of AI in our agency.

How do we make sure we are protecting our clients' intellectual property, as well as our own IP, while using AI?

How do we manage the cost, considering every SaaS product we use wants to up-sell us its ‘own’ AI feature set (all backed by the same underlying LLM) at a premium, per-user price point?

An LLM is a large language model, which is a specific type of generative AI that works with words. To find out more, check out: https://ig.ft.com/generative-ai/

Step 1: An AI policy that makes sense

The first step on our journey was to sort out the data protection issues. We place a high priority on security and privacy, so we wanted to start with an AI policy that was going to be simple to implement and would help our team to avoid making mistakes.

Our policy defines a list of approved AI tools overseen by our CTO, based on five classifications:

  • Experimental - Only to be used within the bounds of the noted experiment.
  • Public - General prompts and searches containing no business information.
  • General Business - Approved for business-in-confidence and public data.
  • Confidential - Approved for IP (including designs, code, etc.) and confidential information.
  • Restricted - Approved for live production data and personal data (both common and sensitive).

We also wanted to allow our clients to opt in to us using AI when it comes to creating their core IP, so we now send them an AI statement outlining the risks and confirming that we won't use AI in a way that impacts their core IP without their permission.

Step 2: Providing a controlled environment

The next step was to provide our team with an approved, controlled environment for consuming AI under our “Confidential” classification. There were a few requirements for this:

  • We must use OpenAI via the API so that our data is not trained into the LLM
  • We must be in control of our own data (not have it locked up in each SaaS product we use)
  • We must have our own URL with single sign-on, making it very clear that this environment is owned by EndGame
  • We must be able to manage the cost

To achieve these requirements, we rolled out our own platform (called Hoist) that provided the team with a ChatGPT-style interface in a controlled environment.
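
To make the first requirement concrete, here is a minimal sketch of what consuming OpenAI "via the API" looks like, using the OpenAI Python SDK. The model name and prompts are illustrative placeholders, not Hoist's actual configuration.

```python
# Minimal sketch: calling OpenAI through the API rather than the consumer ChatGPT app.
# API requests are not used for model training by default, which is the property
# the first requirement above relies on. Model and prompts are placeholders.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-4",  # illustrative; choose a model that balances cost and quality
    messages=[
        {"role": "system", "content": "You are an assistant for EndGame team members."},
        {"role": "user", "content": "Summarise the key risks in this client brief: ..."},
    ],
)
print(response.choices[0].message.content)
```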

For more information about using AI within your own business, you can download our white paper on innovating safely with AI.

Step 3: Getting some business wins

Getting some business wins from using ChatGPT took a few months. After a month, we saw that about half of the team were using ChatGPT daily, but mostly just as an "alternative search engine" or for coding assistance. We weren't seeing a general uplift in AI capability yet, so we did two things:

We created the concept of "apps" inside Hoist. The main thing we were trying to do was make AI repeatable. When someone learned how to write an awesome prompt, we wanted them to share it so the whole team could use it. Apps allowed us to do this, and we started creating an app for each job that we wanted to do with AI. As people started sharing their apps, these apps became the "go-to" source of information and AI assistance for teams.
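
Under the hood, an "app" can be thought of as a shareable prompt template. The sketch below is a hypothetical illustration of the idea, not Hoist's actual implementation; the names and fields are ours.

```python
# Hypothetical sketch of an "app": a named, shareable prompt template that
# anyone on the team can run with their own inputs.
from dataclasses import dataclass


@dataclass
class App:
    name: str
    system_prompt: str   # the refined prompt someone learned to write and shared
    user_template: str   # what the person running the app fills in

    def build_messages(self, **inputs) -> list[dict]:
        """Turn the template plus user inputs into chat messages for the LLM."""
        return [
            {"role": "system", "content": self.system_prompt},
            {"role": "user", "content": self.user_template.format(**inputs)},
        ]


weekly_report = App(
    name="Weekly report helper",
    system_prompt="You are an account manager writing a concise weekly client report.",
    user_template="Project notes:\n{notes}\n\nWrite a report covering progress, risks and budget.",
)
messages = weekly_report.build_messages(notes="Sprint 12 complete; budget tracking 5% under.")
```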

The second thing we did was to create knowledge bases in Hoist. AI becomes significantly more useful when you can reference a lot of context. A knowledge base is like a notebook where you can store information for a specific project or team and make it available as context to the LLM. Knowledge bases store information securely in a vector store, optimised for use with the LLM, without having this information trained into the LLM.
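
The pattern behind this is usually called retrieval-augmented generation (RAG): content is embedded into a vector store once, and at question time only the most relevant chunks are passed to the LLM as context. The sketch below shows the core of that pattern using the OpenAI embeddings API and a simple cosine-similarity search; it is an illustration of the technique, not Hoist's code, and the documents and model names are placeholders.

```python
# RAG in miniature: embed knowledge-base chunks, find the chunk closest to a
# question, and pass it to the LLM as context (nothing is trained into the model).
import numpy as np
from openai import OpenAI

client = OpenAI()

def embed(texts: list[str]) -> np.ndarray:
    resp = client.embeddings.create(model="text-embedding-ada-002", input=texts)
    return np.array([item.embedding for item in resp.data])

documents = [
    "Our brand voice is plain-spoken and optimistic.",
    "We deploy to production every Tuesday.",
]
doc_vectors = embed(documents)          # in practice, stored once in a vector store

question = "What tone should our marketing copy use?"
q_vector = embed([question])[0]

# cosine similarity between the question and each stored chunk
scores = doc_vectors @ q_vector / (
    np.linalg.norm(doc_vectors, axis=1) * np.linalg.norm(q_vector)
)
context = documents[int(np.argmax(scores))]

answer = client.chat.completions.create(
    model="gpt-4",
    messages=[
        {"role": "system", "content": f"Answer using only this context:\n{context}"},
        {"role": "user", "content": question},
    ],
)
print(answer.choices[0].message.content)
```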

The combination of apps and knowledge bases meant that we were now starting to see AI being used in useful, repeatable ways that could benefit the entire company. Some great examples are:

  • Marketing: We built a knowledge base with our brand guidelines, strategy and values, and an app that understands our USP and customers and helps us with marketing work.
  • DevOps: We built a knowledge base with information specific to how we do DevOps and an app that acts as an AI assistant for DevOps.
  • EOS: We use the Entrepreneurial Operating System® business framework, so we built a knowledge base of our implementation details and an app that helps people understand EOS in our specific business context.
  • Weekly reports: We built a knowledge base of project and budget information and an app that helps account managers prepare better weekly reports.
  • Grant helper: We built a knowledge base with Callaghan grant guidance and an app to help us write grant applications.
  • Board of directors: We built a knowledge base with our company constitution and shareholder agreement and an app that helps the board follow correct processes.
  • EndGame library: We built a knowledge base of the books in our internal library and an app that helps us find book recommendations based on those we already have.

The EndGame library is a great example of how AI can change how we work. In this video, Sara shows how she was able to create this app using Hoist.

Step 4: Using AI for client work

Despite starting to see some great use of AI for our business, our journey was far from over. We wanted to start leveraging AI for our clients. When we started this journey, our biggest concern was not leaking client data into a public LLM. Now we faced a new challenge: we wanted to run the same apps for our clients (i.e. use AI to do similar jobs for each client) and use EndGame IP for all clients (i.e. know-how, training, processes), but we needed to make sure we weren't leaking client IP between clients and teams.

To solve this problem, we created workspaces in Hoist. By creating a separate workspace for each client, we are able to create client-specific knowledge bases and a collection of apps that are useful in the context of that client. Apps can be pulled in from our app library or created just for the client.
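
The sketch below illustrates the isolation idea: retrieval and apps are always scoped to a single workspace, so one client's documents can never turn up as context for another. It is a simplified, hypothetical model, not Hoist's actual data model.

```python
# Hypothetical sketch of workspace isolation: every lookup is scoped to one
# client's workspace, so knowledge bases never leak between clients.
from dataclasses import dataclass, field


@dataclass
class Workspace:
    client_name: str
    knowledge_bases: dict[str, list[str]] = field(default_factory=dict)  # kb name -> text chunks
    apps: list[str] = field(default_factory=list)                        # shared or client-specific apps

    def retrieve(self, query: str) -> list[str]:
        # Simple keyword match standing in for a real vector search; the point
        # is that only this workspace's knowledge bases are ever searched.
        return [
            chunk
            for chunks in self.knowledge_bases.values()
            for chunk in chunks
            if query.lower() in chunk.lower()
        ]


acme = Workspace(client_name="Acme Ltd")
acme.knowledge_bases["contracts"] = ["Acme support hours are 9am-5pm NZT."]
acme.apps.append("Weekly report helper")   # pulled in from the shared app library
print(acme.retrieve("support hours"))
```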

Check out this 4-minute demo of using a workspace in Hoist.

Step 5: Building AI capability

It’s human nature to protect the status quo, and AI threatens it more obviously than most technological developments. People are understandably concerned that AI is going to replace them. As a business, we need to adapt to using AI before it disrupts our business model. As a professional services firm, most of our revenue is billed using a "time and materials" model. How do we ensure that AI won't erode our competitive advantage and create a race-to-the-bottom scenario?

Building AI capability takes time and experimentation. Even with Hoist in place, this is a journey that will continue throughout 2024. The important thing is to get started: get the tools in place, build AI awareness and skills, and allow people to innovate in a controlled environment. We believe innovation will come from the whole team as they discover better ways to do repetitive tasks and then share that learning with the company.

To help build AI capability at EndGame, we are doing a few things:

  • Starting to automatically import useful information (such as contracts, weekly reports and project information) into client knowledge bases, so it's ready for each team to leverage
  • Running a training day (we call this a Waza) to encourage people to practise their prompt-writing skills
  • Attending AI events to learn and share
  • Changing how we communicate information, pointing people at a Hoist app instead of an intranet page

The Technology

During this time, we've also been upskilling our development and lifecycle teams on AI technology. LLM and RAG (retrieval-augmented generation) technology is fairly new, and getting the best results from it requires some know-how. We have deployed LangChain as our AI pipeline, and Hoist provides a no-code layer over this AI tech stack.
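
As a flavour of what that pipeline looks like, here is a minimal RAG chain in LangChain. It is an illustrative sketch rather than Hoist's actual code, and the import paths match the classic (pre-0.1) LangChain releases current at the time of writing; newer versions move these classes into separate packages.

```python
# Minimal RAG pipeline with LangChain: embed some knowledge-base text into a
# local FAISS vector store, then answer questions over it with an LLM.
# Requires: pip install langchain openai faiss-cpu
from langchain.chat_models import ChatOpenAI
from langchain.embeddings import OpenAIEmbeddings
from langchain.vectorstores import FAISS
from langchain.chains import RetrievalQA

# Embed illustrative knowledge-base chunks into a local vector store
vector_store = FAISS.from_texts(
    [
        "Callaghan grant applications need a clear R&D plan.",
        "Weekly reports should cover progress, risks and budget.",
    ],
    OpenAIEmbeddings(),
)

# Chain: retrieve the most relevant chunks, then answer with the LLM
qa = RetrievalQA.from_chain_type(
    llm=ChatOpenAI(model_name="gpt-4"),
    retriever=vector_store.as_retriever(),
)

print(qa.run("What do I need before starting a grant application?"))
```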

Join us on this journey

We've built all of our learnings into our Hoist platform and we'd love you to join us on this journey.

Hoist is the essential AI enabler for professional services businesses, designed to keep your client data safe and accessible while protecting your IP and competitive advantage.

Hoist can be deployed in the cloud or behind your own firewall, and it's ideal for professional services firms that want to combine their own IP with their clients' IP and do more with AI.

Sign up for a free 30-day trial at www.hoist.io and we'd be happy to help you take the next step in your own AI journey.

Talk to us about your own AI enabled SaaS product

The work we've done on Hoist has already started to flow over into a number of other SaaS products that we are building, and it has enabled us to help our founders accelerate their own products with built-in AI capability.

If you are starting your SaaS journey, then we'd love to talk to you about how an EndGame Virtual Product Team can help you get to market faster.

EndGame is a software agency working with high-growth, globally minded SaaS founders. We provide a full suite of services, including product strategy, development, hosting and support, to de-risk your business as you grow and help you execute quickly.

We love talking SaaS. If you want to chat about your idea, email me at andrew@end-game.com.

