
Tauri App: One Frontend Codebase for Native and Web

Keeping one frontend codebase for Tauri and the web sounds tidy until runtime differences start sneaking into every feature. This post uses plugin-store as the example, but the same IPC mocking pattern can be adapted to other plugins and commands too.

Most cross-platform frontend codebases do not really stay single-codebase for long. They start shared, then platform checks spread everywhere, storage splits in two, and eventually the web app and the native app are only superficially related.

I wanted the opposite.

I have a frontend that needs to run in two very different environments:

  • as a Tauri app on desktop and mobile, where the browser runtime can call Rust commands and Tauri plugins
  • as a plain web app, where there is no Rust process and no native Tauri runtime

The goal is not just code sharing at the component level. The goal is to keep one actual frontend codebase, with the same screens, the same business logic, and the same storage-facing API, while only swapping the platform behavior underneath.

For this example, that also means being honest about persistence. In native builds, I still use @tauri-apps/plugin-store because it can persist to the filesystem, which is the reliability level I actually want there. In the web build, I do not have that option, so the job is to find the best browser-side fallback that still saves data for long enough to be useful.

This post shows the pattern I use for that.

The short version is:

  1. The app always talks to Tauri APIs from the frontend.
  2. In native Tauri builds, those calls go to Rust and Tauri plugins normally.
  3. In the web build, I enable Tauri's built-in IPC mocking and intercept the same calls in JavaScript.
  4. The mock implementation reproduces just enough plugin behavior for the web app to keep working.

That lets the rest of the application stay almost completely unaware of whether it is running inside Tauri or inside a normal browser.

Architecture at a Glance

This is the overall shape of the setup:

flowchart TD
  A["Shared React / Next frontend"] --> B["TauriProvider"]
  B --> C{"Real Tauri runtime available?"}
  C -->|Yes| D["Use normal Tauri IPC"]
  D --> E["@tauri-apps/plugin-store"]
  E --> F["Rust + Tauri plugin backend"]
  C -->|No| G["Install mockIPC(ipcCallback)"]
  G --> E
  E --> H["JavaScript IPC mock"]
  H --> I["IndexedDB via idb-keyval"]
  B --> J["TauriContext"]
  J --> K["Feature modules use store and isMocked"]

Why This Pattern Is Useful

If you build both a native app and a web app, the easy way is often to fork logic early:

  • one storage layer for Tauri
  • another storage layer for the browser
  • different feature flags scattered through the UI
  • lots of if (isWeb) and if (isDesktop) checks

That works at first, but it pushes platform concerns into the entire app. The code becomes harder to reason about because every feature starts caring about where it is running.

I prefer the opposite direction:

  • keep platform branching at the boundary
  • expose one frontend API to the app
  • make the fallback implementation behave like the native one as closely as practical

In my case, the most important boundary happens to be the Tauri store plugin. A lot of application state depends on it, so it makes a good example. But the larger pattern is not really about storage. It is about picking a plugin boundary the frontend already trusts, then preserving that contract across native and web runtimes.

The Core Idea

Tauri exposes a mock API for IPC calls. In a real Tauri runtime, frontend calls go through the native bridge. In a plain browser, I can install a mock handler and intercept those calls in JavaScript instead.

That gives me a clean split:

  • native environment: use real Tauri behavior
  • web environment: emulate the subset of Tauri behavior my app depends on

For the plugin-store version of this setup, these two files are the center of the implementation:

  • providers/tauri-provider.tsx: decides whether to install mocks and provides the loaded store to the app
  • providers/tauri-mock.ts: implements the mocked IPC behavior for the store plugin example used in this post

Here is the exact request flow that matters most in practice:

sequenceDiagram
  participant UI as Feature code
  participant Provider as TauriProvider
  participant Store as plugin-store client
  participant Native as Native Tauri runtime
  participant Mock as ipcCallback mock
  participant IDB as IndexedDB

  UI->>Provider: read store from context
  Provider->>Store: load('store.json')
  alt Native desktop/mobile app
    Store->>Native: plugin:store|get/set/keys/delete
    Native-->>Store: real plugin response
  else Plain web app
    Provider->>Mock: mockIPC(ipcCallback)
    Store->>Mock: plugin:store|get/set/keys/delete
    Mock->>IDB: get / set / keys / del
    IDB-->>Mock: browser persistence result
    Mock-->>Store: plugin-compatible response
  end
  Store-->>UI: Store instance behaves the same

The Provider: Real Tauri on Native, Mocked Tauri on Web

Here is the provider:

/* eslint-disable no-console */
'use client';

import { load, Store } from '@tauri-apps/plugin-store';
import { useEffect, useState } from 'react';
import { clearMocks, mockIPC } from '@tauri-apps/api/mocks';

import { TauriContext } from '@/contexts/tauri-context';

import { ipcCallback } from './tauri-mock';

export default function TauriProvider({
  children,
}: {
  children: React.ReactNode;
}) {
  const [store, setStore] = useState<Store | null>(null);
  const [isMocked, setIsMocked] = useState(false);

  useEffect(() => {
    const initializeStore = async () => {
      if (
        process.env.NEXT_PUBLIC_TAURI_MOCKED ||
        !window ||
        !('__TAURI__' in window)
      ) {
        console.log('Setting up Tauri IPC mocks');

        setIsMocked(true);
        mockIPC(ipcCallback);
      }

      const loadedStore = await load('store.json', {
        defaults: {},
        autoSave: false,
      });

      setStore(loadedStore);
    };

    initializeStore();

    return () => {
      clearMocks();
      console.log('Cleared Tauri IPC mocks');
    };
  }, []);

  const tauriContextValue = {
    store,
    isMocked,
  };

  return (
    <TauriContext.Provider value={tauriContextValue}>
      {children}
    </TauriContext.Provider>
  );
}

There are a few important decisions in this file.

1. The App Always Loads the Same Store API

The provider always calls:

const loadedStore = await load('store.json', {
  defaults: {},
  autoSave: false,
});

This matters because the rest of the app does not need a separate abstraction such as loadWebStore() versus loadTauriStore().

It just consumes a Store instance from context.

In other words, the application code does not ask:

  • am I in a browser?
  • am I in Tauri?
  • should I use IndexedDB here?

It simply asks for the store.

That is the main design win in this example, and the same principle carries over to other plugins: keep the frontend-facing contract stable, and move the platform differences behind it.

2. Environment Detection Happens Once

This block decides whether to install the mock handler:

if (
  process.env.NEXT_PUBLIC_TAURI_MOCKED ||
  !window ||
  !('__TAURI__' in window)
) {
  setIsMocked(true);
  mockIPC(ipcCallback);
}

Conceptually, the rule is:

  • if a real Tauri runtime is available, do nothing and let Tauri handle IPC normally
  • if not, install a JavaScript IPC interceptor

The optional environment flag is useful because it gives me a manual override. Even if I am inside a Tauri-capable environment, I can still force the mocked path for testing or debugging.

This is especially helpful when I want to validate browser-compatible behavior without changing feature code.

This is also the reason I like to make the environment decision in one place only. Once that choice is made, everything downstream can stay boring.
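In practice the override is a single environment variable. The check in the provider is plain truthiness on a string, so any non-empty value turns the mock on; the file name below is the usual Next.js convention and the value itself is arbitrary:

```shell
# .env.local — force the mocked IPC path even inside a Tauri-capable environment
NEXT_PUBLIC_TAURI_MOCKED=1
```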

3. The Mock Is Installed Before Store Usage

The order is important.

First I register the IPC mock:

mockIPC(ipcCallback);

Then I call load() from @tauri-apps/plugin-store.

That means when the plugin-store package issues its internal IPC calls, they already have somewhere to go. In native Tauri, they go through the real bridge. In the browser, they are intercepted by ipcCallback.

The calling code is the same in both cases.

4. The Provider Exposes Two Things Only

The context value is intentionally small:

const tauriContextValue = {
  store,
  isMocked,
};

That is enough for the app to:

  • use the shared store API
  • optionally adjust a few UI affordances when native-only features are unavailable

That second part is important. Some actions genuinely only make sense in native mode. For example, opening a store file location on disk is a native concern. The app can hide or disable that action when isMocked is true, while keeping the rest of the feature working.
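For completeness, the context behind that value can be sketched like this. The real contexts/tauri-context file is not shown in this post, so the shape below is an assumed reconstruction: StoreLike is a hypothetical structural stand-in for the subset of the plugin's Store API that this post actually calls, and hasStore is an optional convenience guard, not something the provider requires.

```typescript
// Assumed shape of the context value; StoreLike mirrors only the Store
// methods used by the examples in this post.
interface StoreLike {
  get<T>(key: string): Promise<T | undefined>;
  set(key: string, value: unknown): Promise<void>;
  keys(): Promise<string[]>;
  delete(key: string): Promise<boolean>;
}

export interface TauriContextValue {
  store: StoreLike | null; // null until load('store.json') resolves
  isMocked: boolean;       // true when mockIPC(ipcCallback) was installed
}

// Narrow `store` from `StoreLike | null` before use.
export const hasStore = (
  v: TauriContextValue
): v is TauriContextValue & { store: StoreLike } => v.store !== null;
```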

The Mock: Reimplementing the Store Plugin Boundary in the Browser

Here is the mock handler:

import { get, set, del, keys } from 'idb-keyval';

const STORE_PREFIX = process.env.NEXT_PUBLIC_STORE_MOCK_PREFIX || 'store::';

const storeKey = (key: string) => STORE_PREFIX + key;

/* eslint-disable no-console */
export const ipcCallback = async (cmd: string, payload: any) => {
  console.log(`Mocked IPC call: ${cmd}`, payload);

  if (cmd === 'plugin:store|get') {
    if (payload && 'key' in payload) {
      const { key } = payload as { key: string };
      const data = await get(storeKey(key));

      // idb-keyval returns undefined for missing keys; compare explicitly so
      // that falsy stored values like 0, '' or false still count as present.
      if (data !== undefined) {
        return [data, true];
      } else {
        return [null, false];
      }
    }
  } else if (cmd === 'plugin:store|set') {
    if (payload && 'key' in payload && 'value' in payload) {
      const { key, value } = payload as { key: string; value: any };

      await set(storeKey(key), value);

      return null;
    }
  } else if (cmd === 'plugin:store|keys') {
    const allKeys = await keys();
    const filtered = allKeys
      .filter((key) => typeof key === 'string' && key.startsWith(STORE_PREFIX))
      .map((key) => (key as string).substring(STORE_PREFIX.length));

    return filtered;
  } else if (cmd === 'plugin:store|delete') {
    if (payload && 'key' in payload) {
      const { key } = payload as { key: string };

      await del(storeKey(key));
    }
  }

  return null;
};

This file is doing something very specific: it is not mocking my app. It is mocking the Tauri plugin protocol that my app already depends on.

That distinction matters, and it is also the part that generalizes cleanly beyond storage.

Instead of rewriting the application to speak some browser-specific storage API, I emulate the plugin commands that @tauri-apps/plugin-store expects. If the example were a different plugin, the same idea would still apply: preserve the plugin contract, then adapt the implementation underneath it.

Why IndexedDB Is a Good Fit Here

For the browser-side implementation, I use idb-keyval, which is a thin wrapper around IndexedDB.

That choice is specifically for the web fallback. In the native app, I still use the store plugin instead of talking to the embedded browser's IndexedDB directly, because file-backed persistence is the point of using the plugin in the first place. On the web, there is no equivalent filesystem-backed path available through Tauri, so the question changes from "what is identical?" to "what is the best persistence option that still serves the purpose well enough?"

That is a good match for this use case because:

  • it is asynchronous, like the native plugin boundary already is
  • it persists across page reloads
  • it is available in normal browsers without extra infrastructure
  • it is simple enough that the mock stays small

I do not need to recreate every detail of the Tauri store plugin. I only need to implement the commands my app actually uses.

The Command Mapping

The mock currently handles four plugin commands:

  • plugin:store|get
  • plugin:store|set
  • plugin:store|keys
  • plugin:store|delete

That command list effectively defines the storage contract that the rest of the frontend relies on.

plugin:store|get

if (cmd === 'plugin:store|get') {
  if (payload && 'key' in payload) {
    const { key } = payload as { key: string };
    const data = await get(storeKey(key));

    if (data !== undefined) {
      return [data, true];
    } else {
      return [null, false];
    }
  }
}

The interesting part here is that the mock returns the shape expected by the plugin caller, not just the raw value.

That is exactly what makes this pattern work well. The mock should imitate the native protocol, not invent a more convenient browser-only protocol.

plugin:store|set

} else if (cmd === 'plugin:store|set') {
  if (payload && 'key' in payload && 'value' in payload) {
    const { key, value } = payload as { key: string; value: any };

    await set(storeKey(key), value);

    return null;
  }
}

This maps cleanly to IndexedDB. The plugin command carries a key and a value. The browser mock persists it using idb-keyval.

plugin:store|keys

} else if (cmd === 'plugin:store|keys') {
  const allKeys = await keys();
  const filtered = allKeys
    .filter((key) => typeof key === 'string' && key.startsWith(STORE_PREFIX))
    .map((key) => (key as string).substring(STORE_PREFIX.length));

  return filtered;
}

This is where the key prefix matters.

Since IndexedDB may contain other keys, I namespace everything for the mocked store and strip the prefix before returning results. That way the rest of the app sees the same logical store keys regardless of platform.

plugin:store|delete

} else if (cmd === 'plugin:store|delete') {
  if (payload && 'key' in payload) {
    const { key } = payload as { key: string };

    await del(storeKey(key));
  }
}

Again, the goal is not sophistication. The goal is parity at the API boundary.
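To see the whole contract in one place, the four commands can be condensed into a self-contained sketch. An in-memory Map stands in for idb-keyval purely so the snippet runs anywhere; the real mock persists to IndexedDB, and miniIpcCallback is just an illustrative name:

```typescript
type StorePayload = Record<string, unknown> | undefined;

const PREFIX = 'store::';
const backing = new Map<string, unknown>(); // illustrative stand-in for IndexedDB

export const miniIpcCallback = async (cmd: string, payload: StorePayload) => {
  switch (cmd) {
    case 'plugin:store|get': {
      const key = PREFIX + String(payload?.key);
      // Same [value, exists] tuple the plugin-store client expects.
      return backing.has(key) ? [backing.get(key), true] : [null, false];
    }
    case 'plugin:store|set':
      backing.set(PREFIX + String(payload?.key), payload?.value);
      return null;
    case 'plugin:store|keys':
      // Strip the namespace so callers see the same logical keys as on native.
      return Array.from(backing.keys())
        .filter((k) => k.startsWith(PREFIX))
        .map((k) => k.substring(PREFIX.length));
    case 'plugin:store|delete':
      backing.delete(PREFIX + String(payload?.key));
      return null;
    default:
      return null; // unhandled commands fall through, as in the real mock
  }
};
```

The protocol shape, not the storage engine, is the part worth getting exactly right.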

Why the Prefix Is More Important Than It Looks

This line is small but important:

const STORE_PREFIX = process.env.NEXT_PUBLIC_STORE_MOCK_PREFIX || 'store::';

Without namespacing, the browser-backed mock can easily collide with other data living in IndexedDB.

By keeping a prefix:

  • the mock store stays isolated
  • multiple apps can coexist more safely in the same browser profile
  • local debugging becomes easier because the stored keys are recognizable
  • future migrations are easier because the storage domain is explicit

Using an environment variable for the prefix is also a good small flexibility point. It lets me isolate environments such as local development, staging, or different app variants without changing code.
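In practice that is one line of configuration per environment. The variable name comes from the snippet above; the value is illustrative:

```shell
# .env.local — keep the dev build's mocked store in its own IndexedDB namespace
NEXT_PUBLIC_STORE_MOCK_PREFIX=myapp-dev::
```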

What the Rest of the App Gets From This

Once the provider exposes a loaded Store, most of the application does not care where that store came from.

A typical consumer looks like this in my codebase:

const { store, isMocked } = useTauriContext();

Then it can do things like:

store.keys().then((keys) => {
  const keyItems = keys.map((key) => ({ key }));
  keyItems.sort((a, b) => a.key.localeCompare(b.key));
  setStoreKeys(keyItems);
});

or:

store.get(storeKey).then((value) => {
  if (value) {
    setStoreEntry({ key: storeKey, value });
  }
});

That code is not web-specific and not desktop-specific.

It is just application code.

That is the outcome I want: platform handling stays near the provider and the mock, not inside every feature module.

Native-Only Actions Can Still Exist

This pattern does not mean every feature must behave identically everywhere.

Some actions are inherently native. For example, one part of my app can ask Rust to open the on-disk store location:

const handleOpen = async () => {
  await invoke('open_store_location');
};

That should only be available when a real Tauri runtime exists.

So the UI can simply gate that action:

{!isMocked ? <Button onPress={handleOpen}>Open</Button> : null}

That is a much better split than branching the whole feature. The feature still works in both environments. Only the truly native affordance disappears on the web.

Why I Prefer Mocking IPC Instead of Forking the Store Layer

Another way to solve this would be to write a custom storage abstraction and inject either:

  • a Tauri implementation
  • or an IndexedDB implementation

That approach is valid, but I do not think it is the best fit here.

The frontend already depends on Tauri plugin-store semantics. The plugin package already gives me a stable API. If I fork that abstraction too early, I end up maintaining my own compatibility layer above an existing compatibility layer.

By mocking IPC instead:

  • I keep using the official plugin package
  • I avoid duplicating store-facing logic
  • I reuse the same frontend call paths in native and web modes
  • I test more realistic behavior, because the browser path still flows through the same plugin interface

This is the key architectural choice in the whole setup.

If I had to summarize the philosophy in one sentence, it would be this: keep the frontend contract stable and swap the runtime implementation underneath it.

Practical Benefits

After using this pattern for a while, the biggest benefit is that feature code stays pleasantly ordinary. In this post the example is the store API, but the benefit is broader than storage: components keep talking to the same frontend-facing contract regardless of whether the app is running in a browser, on desktop, or on mobile. That means most of the UI does not need to carry little platform anxieties around with it, which is nice for the code and also for the human reading it three months later.

It also makes day-to-day development easier. I can work in a normal browser, keep the same storage-oriented flows that the native app uses, and still get something much more useful than a fake demo shell. The web version can preserve user state and behave like a real application, even if a few native-only actions are deliberately absent.

The other practical win is that adoption can stay incremental. I do not need to solve every Tauri plugin on day one or design a grand abstraction framework before shipping anything. I can mock the commands I actually use, keep the boundary small, and extend it only when the product gives me a reason to.

The Constraints and Tradeoffs

This pattern is useful, but it is not magic.

The first tradeoff is straightforward: you only get the behavior you are willing to mock. If the app starts depending on more store-plugin commands, the browser implementation has to grow with it. That is not a design failure, but it is real maintenance work, and it is better to admit that up front than pretend the browser fallback will maintain itself out of politeness.

There is also the fact that native and browser persistence are only similar, not identical. On native, the store lives behind a plugin that can persist to the filesystem, which is exactly why I keep using it there instead of dropping down to the embedded browser's IndexedDB. On the web, that filesystem-backed option does not exist, so the fallback is IndexedDB because it is the most practical way to keep data around for long enough to be useful. The API can look the same while the operational characteristics underneath are still different, so I would not oversell the parity. Good enough for shared frontend logic is usually the goal here, not perfect metaphysical sameness.

The last tradeoff is about honesty. Some features really are native-only, especially anything involving filesystem access, shell integration, deep OS hooks, or direct Rust commands. Those should be hidden, degraded, or replaced intentionally in the web build instead of being awkwardly forced through a fake abstraction. And if the mock drifts too far away from the real native contract, it can create false confidence, so the safest version of this pattern is the boring one: keep the mock small, focused, and protocol-accurate.

A Good Rule of Thumb

Mock at the IPC boundary, not at the feature boundary.

That usually gives the best balance: enough realism to keep shared code honest, enough flexibility to run in the browser, and far less platform branching leaking into application logic.

How I Would Extend This Pattern

The same approach works beyond the store plugin used in this article.

If the web build needs to support more Tauri-powered features, I would add browser implementations for the exact commands that matter most.

Examples:

  • a browser fallback for file import and export
  • a mock layer for selected invoke() commands
  • capability flags exposed through the same context
  • web-safe replacements for native dialogs or open-file flows

The important part is to keep the contract stable from the app's point of view.
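Capability flags are a good example of that. Instead of scattering !isMocked checks through the UI, the same context can expose named capabilities derived once at the boundary. This is a sketch, and every flag name here is an assumption, not part of the setup shown above:

```typescript
// Derive feature availability once, next to where isMocked is decided.
export interface Capabilities {
  canOpenStoreLocation: boolean; // needs Rust and a real filesystem
  canUseNativeDialogs: boolean;  // needs the native runtime
}

export const deriveCapabilities = (isMocked: boolean): Capabilities => ({
  canOpenStoreLocation: !isMocked,
  canUseNativeDialogs: !isMocked,
});
```

Feature code then asks "can I open the store location?" instead of "am I running in Tauri?", which keeps the platform question at the boundary.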

When This Approach Is Worth It

I would use this pattern when the native and web versions still share most screens and business logic, when the frontend already depends on Tauri APIs or plugins, and when the browser build is meant to be a real usable app rather than a marketing shell wearing an app costume.

It also helps when the number of native-only features is limited enough that selective mocking stays practical. If the web and native products are fundamentally different applications that only happen to share some components, I would probably not bother forcing this approach onto them.

Final Takeaway

The important trick is not "make Tauri work on the web".

The important trick is narrower and more useful:

Use Tauri's built-in mock support to preserve the same frontend contract across native and web runtimes.

In this setup:

  • native Tauri builds use the real Rust-backed path
  • web builds intercept IPC in JavaScript
  • the app still talks to the same plugin API it was already built around
  • only a small platform boundary knows the difference

That is what makes a single frontend codebase practical here.

Instead of designing two frontend architectures and trying to keep them in sync, I keep one frontend architecture and swap the runtime behavior underneath it.

For this class of app, that is the simpler and more durable choice.