What synced in-app SQLite brings to Expo apps
Christiaan Landman
Guest Author
Combine Expo and PowerSync to bring flow state joy to full stack development.

Expo’s zero-config builds and OTA updates allow you to focus on your product, not the platform you’re building for. Unfortunately, Expo’s powers are limited to the frontend. Even a reasonably simple backend architecture can reintroduce stress and drudgery just when you got into frontend development flow (H/T to Instant for doing a great job explaining how).
The good news is that having a real-time synced database live inside your Expo app can solve most of this pain. There are a few ways to implement that. Our approach is to keep your backend database of choice and simply provide a sync engine that syncs that with in-app SQLite.
The network tarpit
The network management work required for modern apps seems simple at first but you can quickly get caught up in it. That’s why we call it the network tarpit: initially it looks very doable, but it easily becomes complex. Once you’re in, struggling to escape is a good way to sink further. It’s better to avoid the network tarpit altogether, and that’s where synced in-app SQLite comes in.
Instead of having to:
- manage network requests
- handle loading states
- implement retry logic
- orchestrate state management between your local cache and remote data
You run local SQLite queries against data that's automatically synced in the background.
Instead of having to:
- think about online/offline states
- implement optimistic updates
- manage/wrangle API endpoints
You just write queries.
There's also a UX benefit of instant data access with zero latency. Your app feels responsive because all reads happen locally against SQLite. Users can work offline without even noticing. When they come back online, their changes sync automatically. No more loading spinners, no more "check your connection" errors, just a smooth experience that works.
In this post, we’ll walk through exactly how to combine PowerSync and Expo (and, in this case Supabase as our backend database), to bring flow state joy to full stack development.
What is PowerSync?
PowerSync embeds SQLite in your application and automatically keeps it in sync with your backend database. It’s a sync engine that consists of two components: a service that enables high-scalability partial data syncing, and a set of client SDKs that manage client-side persistence, consistency, reactivity, and syncing write operations back. PowerSync supports Postgres, MongoDB, and MySQL backend databases.
Even though SQL is probably older than you are, it’s still the go-to mechanism for expressing most data querying needs.
For each platform that PowerSync supports, a native SDK exposes the API for manipulating the SQLite database’s data and connecting to a PowerSync Service. Lower-level common code is packaged into a core SQLite extension which is loaded into the app’s SQLite database. Local writes are queued in an upload table, which is processed by your connector implementation; this allows you to write each change to your backend. Changes are synced down from the PowerSync Service to your in-app databases.
Expo applies a +10 Developer experience buff to PowerSync
Expo is the most developer-friendly way to build React Native applications, smoothing the rough edges of the ecosystem through SDK modules and development tools, with Expo Go being one of the standout boons. It allows you to quickly spin up a sandbox of your application without having to make a development build. Skipping the native build step means faster initial setup and iteration cycles; you don’t even need Xcode or Android Studio installed. We have recently added support for Expo Go through our @powersync/adapter-sql-js package, built on top of SQL.js.
For PowerSync, Expo becomes particularly useful when accessing native device features like the file system, where our attachment helper package leverages the Expo FileSystem module to implement a storage adapter.
Project: What’s on your mind?
Today we’re building a thought journaling app with PowerSync, Expo, and Supabase that allows other users to react to your thoughts with emojis 😅. PowerSync will handle storing our app data locally in SQLite while automatically syncing everything with a Supabase backend.
Client-Side Implementation
For this tutorial, we'll use a self-hosted backend setup with Supabase as our database solution. Supabase is an excellent choice because it exposes a client-side SDK that allows you to upload changes directly to your database. This is all that’s needed to support multiple clients connecting and syncing data with each other. To simplify auth, we’ll make use of Supabase’s anonymous auth feature. This setup assumes you have Docker and the Supabase CLI installed.
We’re building a synced app, but it’s also possible to use PowerSync without any backend whatsoever, and add sync later - for example as part of a plan upgrade user flow. See our docs for details.
To start, we can create our app project; the Tailwind template works well for our needs.

```shell
npx create-expo-app@latest -e with-tailwindcss thoughts-journal
cd thoughts-journal
```

There are a few dependencies we’ll need for the client.
```shell
npm install @powersync/adapter-sql-js @powersync/react-native @supabase/supabase-js
```
Since we want to configure syncing from the get-go, we can use a self-host helper project. It includes a Docker config that runs PowerSync and Supabase locally. Degit allows us to clone the code from a repo without all the git history; we’ll be pulling our community repo.
```shell
npx degit powersync-community/powersync-supabase-backend-template backend
```
Note that all code snippets mentioned below are also available in the /backend/client directory if you want to just copy and paste entire files.
Next, let’s create the environment files for the client and backend (Our environment files assume default configuration values exposed by Supabase here and here).
The Expo app’s environment file:
```
# These values are fixed for a local setup with Supabase, you can use them as is.
EXPO_PUBLIC_SUPABASE_ANON_KEY=eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJpc3MiOiJzdXBhYmFzZS1kZW1vIiwicm9sZSI6ImFub24iLCJleHAiOjE5ODM4MTI5OTZ9.CRXP1A7WOeoJeXxjNni43kdQwgnWNReilDMblYTn_I0
EXPO_PUBLIC_SUPABASE_URL=http://localhost:54321
EXPO_PUBLIC_POWERSYNC_URL=http://localhost:8080
```
The PowerSync Service needs to know the Supabase JWT secret:
```
# These values are fixed for a local setup with Supabase, you can use them as is.
PS_SUPABASE_JWT_SECRET=super-secret-jwt-token-with-at-least-32-characters-long
```
Time to add the AppSchema, SupabaseConnector, and SystemProvider implementations for our app to src/powersync.
→ App Schema
The client-side schema will include all tables you’d like in the SQLite database embedded in your React Native application. Sync Rules defined on your PowerSync Service instance give you control over which data is replicated to those tables. It is not necessary to specify an id column for any table as that is automatically created by the SDK.
```typescript
import { column, Schema, Table } from "@powersync/react-native";

const thoughts = new Table({
  content: column.text,
  created_at: column.text,
  created_by: column.text,
});

const reactions = new Table(
  {
    thought_id: column.text,
    user_id: column.text,
    emoji: column.text,
    created_at: column.text,
  },
  { indexes: { thought: ["thought_id"] } }
);

export const AppSchema = new Schema({
  thoughts,
  reactions,
});

export type Database = (typeof AppSchema)["types"];
export type ThoughtRecord = Database["thoughts"];
export type ReactionRecord = Database["reactions"];
```
→ Supabase Connector
Any PowerSync application that wants to sync data needs a backend connector that provides the connection between the PowerSync Client SDK and your backend. It has two responsibilities: authentication (through fetchCredentials()) and uploading client-side data updates (through uploadData()). Our implementation leans heavily on the Supabase SDK, but you could easily swap in your own integration.
For authentication, your backend application needs to generate JWTs that the PowerSync Client SDK can retrieve and use for authentication against your PowerSync Service instance. If you're using Supabase this isn't needed and we can simply re-use the Supabase JWT.
For uploading data, the SDK exposes a mechanism to process local writes. You are in complete control of what should happen to each write (they can be applied to your backend or discarded based on some condition). You might be thinking about what happens in cases where multiple clients have conflicting local writes. The standard approach is “last write wins”. We have documented how you can approach it here.
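If last-write-wins isn’t the right fit for a particular table, one common alternative is an optimistic-concurrency guard applied in your backend. The sketch below is our own illustration, not part of this tutorial’s schema: it assumes a hypothetical updated_at column on the server and a last_seen_updated_at value captured on the client when the row was read.

```sql
-- Hypothetical optimistic-concurrency guard (updated_at is an assumed column,
-- not part of this tutorial's schema): apply the client's update only if the
-- server row hasn't changed since the client last saw it. If zero rows are
-- affected, the write lost the race and can be surfaced to the user instead
-- of silently overwriting newer data.
UPDATE thoughts
SET content = :content
WHERE id = :id
  AND updated_at = :last_seen_updated_at;
```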
If you are new to Supabase it might seem scary to be uploading to your backend database directly, but understanding how Row Level Security works will put you at ease.
```typescript
import {
  AbstractPowerSyncDatabase,
  BaseObserver,
  CrudEntry,
  PowerSyncBackendConnector,
  UpdateType,
} from "@powersync/react-native";
import { createClient, SupabaseClient } from "@supabase/supabase-js";

export type SupabaseConfig = {
  supabaseUrl: string;
  supabaseAnonKey: string;
  powersyncUrl: string;
};

/// Postgres Response codes that we cannot recover from by retrying.
const FATAL_RESPONSE_CODES = [
  // Class 22 — Data Exception
  // Examples include data type mismatch.
  new RegExp("^22...$"),
  // Class 23 — Integrity Constraint Violation.
  // Examples include NOT NULL, FOREIGN KEY and UNIQUE violations.
  new RegExp("^23...$"),
  // INSUFFICIENT PRIVILEGE - typically a row-level security violation
  new RegExp("^42501$"),
];

export type SupabaseConnectorListener = {
  initialized: () => void;
};

export class SupabaseConnector
  extends BaseObserver<SupabaseConnectorListener>
  implements PowerSyncBackendConnector
{
  readonly client: SupabaseClient;
  readonly config: SupabaseConfig;
  userId?: string;

  constructor() {
    super();
    this.config = {
      supabaseUrl: process.env.EXPO_PUBLIC_SUPABASE_URL!,
      powersyncUrl: process.env.EXPO_PUBLIC_POWERSYNC_URL!,
      supabaseAnonKey: process.env.EXPO_PUBLIC_SUPABASE_ANON_KEY!,
    };
    this.client = createClient(
      this.config.supabaseUrl,
      this.config.supabaseAnonKey
    );
    this.loadUserId();
  }

  async loadUserId(): Promise<void> {
    let {
      data: { session },
    } = await this.client.auth.getSession();
    if (session == null) {
      const { data, error } = await this.client.auth.signInAnonymously();
      if (error) {
        throw error;
      }
      session = data.session;
    }
    if (session == null || session.user == null) {
      throw new Error(`Failed to get Supabase session or user`);
    }
    this.userId = session.user.id;
  }

  async fetchCredentials() {
    let {
      data: { session },
    } = await this.client.auth.getSession();
    if (session == null) {
      const { data, error } = await this.client.auth.signInAnonymously();
      if (error) {
        throw error;
      }
      session = data.session;
    }
    if (session == null) {
      throw new Error(`Failed to get Supabase session`);
    }
    return {
      endpoint: this.config.powersyncUrl,
      token: session.access_token,
    };
  }

  async uploadData(database: AbstractPowerSyncDatabase): Promise<void> {
    console.log("Uploading data to Supabase...");
    const transaction = await database.getNextCrudTransaction();
    if (!transaction) {
      return;
    }
    let lastOp: CrudEntry | null = null;
    try {
      // Note: If transactional consistency is important, use database functions
      // or edge functions to process the entire transaction in a single call.
      for (const op of transaction.crud) {
        lastOp = op;
        const table = this.client.from(op.table);
        let result: any = null;
        switch (op.op) {
          case UpdateType.PUT:
            const record = { ...op.opData, id: op.id };
            result = await table.upsert(record);
            break;
          case UpdateType.PATCH:
            result = await table.update(op.opData).eq("id", op.id);
            break;
          case UpdateType.DELETE:
            result = await table.delete().eq("id", op.id);
            break;
        }
        if (result.error) {
          console.error(result.error);
          result.error.message = `Could not ${op.op} data to Supabase error: ${JSON.stringify(result)}`;
          throw result.error;
        }
      }
      await transaction.complete();
    } catch (ex: any) {
      console.debug(ex);
      if (
        typeof ex.code == "string" &&
        FATAL_RESPONSE_CODES.some((regex) => regex.test(ex.code))
      ) {
        /**
         * Instead of blocking the queue with these errors,
         * discard the (rest of the) transaction.
         *
         * Note that these errors typically indicate a bug in the application.
         * If protecting against data loss is important, save the failing records
         * elsewhere instead of discarding, and/or notify the user.
         */
        console.error("Data upload error - discarding:", lastOp, ex);
        await transaction.complete();
      } else {
        // Error may be retryable - e.g. network error or temporary server error.
        // Throwing an error here causes this call to be retried after a delay.
        throw ex;
      }
    }
  }
}
```
→ System Provider
This provider creates our PowerSync client and configures it with the app schema and database adapter. To use PowerSync with Expo Go, we use a JS-only adapter (imported from @powersync/adapter-sql-js). This setup is great for development, but we recommend switching to our OP-sqlite or RNQS adapters when making production builds, as they give substantially better performance.
```typescript
import { SupabaseConnector } from './SupabaseConnector';
import { SQLJSOpenFactory } from '@powersync/adapter-sql-js';
import { createBaseLogger, LogLevel, PowerSyncContext, PowerSyncDatabase } from '@powersync/react-native';
import { AppSchema } from './AppSchema';
import React, { PropsWithChildren } from 'react';

const SupabaseContext = React.createContext<SupabaseConnector | null>(null);
export const useSupabase = () => React.useContext(SupabaseContext);

export const powerSync = new PowerSyncDatabase({
  schema: AppSchema,
  database: new SQLJSOpenFactory({
    dbFilename: 'app.db'
  })
});

export const connector = new SupabaseConnector();
powerSync.connect(connector);

const logger = createBaseLogger();
logger.useDefaults();
logger.setLevel(LogLevel.DEBUG);

export const SystemProvider = ({ children }: PropsWithChildren) => {
  return (
    <PowerSyncContext.Provider value={powerSync as any}>
      <SupabaseContext.Provider value={connector}>
        {children}
      </SupabaseContext.Provider>
    </PowerSyncContext.Provider>
  );
};

export default SystemProvider;
```
Import and use the SystemProvider in the root layout.
```typescript
import SystemProvider from "@/powersync/SystemProvider";
import "../global.css";
import { Slot } from "expo-router";

export default function Layout() {
  return (
    <SystemProvider>
      <Slot />
    </SystemProvider>
  );
}
```
Replace the entire index.tsx file’s contents. For reactivity, we use the useQuery() hook from @powersync/react-native, which executes the SQL read query and re-executes whenever the query’s underlying tables change.
For write queries, we access the PowerSync client with usePowerSync() and call execute() on it to insert and delete rows.
```typescript
import { usePowerSync, useQuery } from "@powersync/react-native";
import { ThoughtRecord } from "@/powersync/AppSchema";
import { useSupabase } from "@/powersync/SystemProvider";
import React, { useState } from "react";
import {
  View,
  Text,
  ScrollView,
  TouchableOpacity,
  TextInput,
  Modal,
  Dimensions
} from "react-native";
import { useSafeAreaInsets } from "react-native-safe-area-context";

const { width } = Dimensions.get('window');

type EmojiCounter = { emoji: string, count: number }

const commonEmojis = ["❤️", "👍", "😂", "😊", "🔥", "💯", "🚀", "💡", "🌟", "👏"];

function ThoughtReactions({ thoughtId }) {
  const [showEmojiPicker, setShowEmojiPicker] = useState(false);
  const powersync = usePowerSync();
  const connector = useSupabase();

  // Query reactions grouped by emoji with counts
  const { data: reactionGroups } = useQuery<EmojiCounter>(
    /* sql */ `
      SELECT
        emoji,
        COUNT(*) as count
      FROM
        reactions
      WHERE
        thought_id = ?
      GROUP BY
        emoji
    `,
    [thoughtId]
  );

  const handleAddReaction = async (emoji: string) => {
    try {
      await powersync.execute(
        /* sql */ `
          INSERT INTO
            reactions (id, thought_id, user_id, emoji, created_at)
          VALUES
            (uuid(), ?, ?, ?, datetime())
        `,
        [thoughtId, connector.userId, emoji]
      );
      setShowEmojiPicker(false);
    } catch (error) {
      console.error("Error adding reaction:", error);
    }
  };

  return (
    <>
      {/* Reactions Section */}
      <View className="flex-row justify-between items-center">
        <View className="flex-row flex-wrap flex-1">
          {reactionGroups.map((group) => (
            <View key={group.emoji} className="flex-row items-center bg-gray-100 rounded-full px-3 py-1.5 mr-2 mb-1">
              <Text className="text-base">{group.emoji}</Text>
              {group.count > 1 ? (
                <Text className="text-sm text-gray-600 ml-1">{group.count}</Text>
              ) : null}
            </View>
          ))}
        </View>
        {/* Add Reaction Button */}
        <TouchableOpacity
          className="flex-row items-center bg-gray-100 rounded-full px-3 py-1.5"
          onPress={() => setShowEmojiPicker(!showEmojiPicker)}
        >
          <Text className="text-sm text-gray-600">😊 React</Text>
        </TouchableOpacity>
      </View>
      {/* Emoji Picker */}
      {showEmojiPicker && (
        <View className="mt-3 bg-white rounded-xl p-3 border border-gray-200">
          <View className="flex-row flex-wrap justify-between">
            {commonEmojis.map((emoji) => (
              <TouchableOpacity
                key={emoji}
                className="justify-center items-center rounded-lg mb-1 h-10"
                style={{ width: width * 0.15 }}
                onPress={() => handleAddReaction(emoji)}
              >
                <Text className="text-2xl">{emoji}</Text>
              </TouchableOpacity>
            ))}
          </View>
        </View>
      )}
    </>
  );
}

export default function ThoughtsApp() {
  const { top } = useSafeAreaInsets();
  const [showNewThought, setShowNewThought] = useState(false);
  const [newThoughtContent, setNewThoughtContent] = useState("");
  const powersync = usePowerSync();
  const connector = useSupabase();

  // Query all thoughts
  const { data: thoughts } = useQuery<ThoughtRecord>(
    /* sql */ `
      SELECT
        *
      FROM
        thoughts
    `
  );

  const handleAddThought = async () => {
    if (newThoughtContent.trim()) {
      try {
        await powersync.execute(
          /* sql */ `
            INSERT INTO
              thoughts (id, content, created_at, created_by)
            VALUES
              (uuid(), ?, datetime(), ?)
          `,
          [newThoughtContent.trim(), connector.userId]
        );
        setNewThoughtContent("");
        setShowNewThought(false);
      } catch (error) {
        console.error("Error adding thought:", error);
      }
    }
  };

  const handleDeleteThought = async (thoughtId: string) => {
    try {
      // Delete all reactions for this thought first
      await powersync.execute(
        /* sql */ `
          DELETE FROM reactions
          WHERE
            thought_id = ?
        `,
        [thoughtId]
      );
      // Then delete the thought
      await powersync.execute(
        /* sql */ `
          DELETE FROM thoughts
          WHERE
            id = ?
        `,
        [thoughtId]
      );
    } catch (error) {
      console.error("Error deleting thought:", error);
    }
  };

  return (
    <View className="flex-1 bg-gray-50">
      {/* Header */}
      <View style={{ paddingTop: top }} className="bg-white border-b border-gray-200 ios:shadow android:elevation-2">
        <View className="px-4 py-4">
          <Text className="text-2xl font-bold text-gray-900">Thoughts</Text>
        </View>
      </View>
      {/* Main Content */}
      <ScrollView className="flex-1" contentContainerClassName="p-4 pb-24">
        {thoughts.map((thought) => (
          <View key={thought.id} className="bg-white rounded-xl p-5 mb-4 ios:shadow android:elevation-2 border border-gray-200">
            {/* Thought Content */}
            <View className="mb-4">
              <View className="flex-row justify-between items-start mb-2">
                <Text className="text-lg leading-7 text-gray-900 flex-1 mr-2">{thought.content}</Text>
                {thought.created_by === connector.userId && (
                  <TouchableOpacity
                    className="w-8 h-8 bg-red-100 rounded-full justify-center items-center"
                    onPress={() => handleDeleteThought(thought.id)}
                  >
                    <Text className="text-red-600 text-lg font-bold">-</Text>
                  </TouchableOpacity>
                )}
              </View>
              <Text className="text-sm text-gray-500">Created: {thought.created_at}</Text>
            </View>
            {/* Reactions Component */}
            <ThoughtReactions thoughtId={thought.id} />
          </View>
        ))}
      </ScrollView>
      {/* FAB Button */}
      <TouchableOpacity
        className="absolute bottom-6 right-6 w-14 h-14 bg-blue-500 rounded-full justify-center items-center ios:shadow-lg android:elevation-8"
        onPress={() => setShowNewThought(true)}
      >
        <Text className="text-2xl text-white font-bold">+</Text>
      </TouchableOpacity>
      {/* New Thought Modal */}
      <Modal
        visible={showNewThought}
        animationType="slide"
        transparent={true}
        onRequestClose={() => setShowNewThought(false)}
      >
        <View className="flex-1 bg-black/50 justify-center items-center p-4">
          <View className="bg-white rounded-xl p-6 w-full max-w-sm">
            <Text className="text-lg font-semibold mb-4 text-gray-900">Share a thought</Text>
            <TextInput
              className="border border-gray-300 rounded-lg p-3 text-base mb-4"
              style={{ height: 120, textAlignVertical: 'top' }}
              value={newThoughtContent}
              onChangeText={setNewThoughtContent}
              placeholder="What's on your mind?"
              multiline
              autoFocus
            />
            <View className="flex-row justify-end gap-2">
              <TouchableOpacity
                className="px-4 py-2 rounded-lg"
                onPress={() => setShowNewThought(false)}
              >
                <Text className="text-gray-600 text-base">Cancel</Text>
              </TouchableOpacity>
              <TouchableOpacity
                className={`px-4 py-2 rounded-lg ${!newThoughtContent.trim() ? 'bg-gray-400' : 'bg-blue-500'}`}
                onPress={handleAddThought}
                disabled={!newThoughtContent.trim()}
              >
                <Text className="text-white text-base font-semibold">Share</Text>
              </TouchableOpacity>
            </View>
          </View>
        </View>
      </Modal>
    </View>
  );
}
```
At this point we could run the client, but without a backend nothing is going to work as our implementation depends on at the very least getting a user ID from Supabase.
Our project structure at this point:

```
.env.local
backend/
└── .env.local
src/
├── app/
│   ├── _layout.tsx
│   └── index.tsx
└── powersync/
    ├── AppSchema.ts
    ├── SupabaseConnector.ts
    └── SystemProvider.tsx
```
Configuring the Sync Backend
With our client setup completed, the next step is to spin up our backend. We’ll start by navigating to the backend directory we pulled with degit and get to booting Supabase through its CLI.
```shell
cd backend
supabase start
```
The backend project includes a supabase directory which contains configuration for our Supabase instance. The most important entries are:
- The migration script which sets up our database schema and publication (needed to get changes from the source database to PowerSync). The schema consists of two tables: one for thoughts and another for reactions, where each reaction has a relationship to a thought.
- The base config.toml, with the only alteration being that enable_anonymous_sign_ins has been enabled (this simplifies our demo, but you could let users sign in instead).
- The seed data script, which gives us some test data to start with.
Starting Supabase will output a bunch of useful key-values and local URLs (like the Supabase studio URL).
We’re now all set to start the PowerSync Service (alternatively run backend/docker-start.sh):
```shell
docker run \
  -p 8080:8080 \
  -e POWERSYNC_CONFIG_B64=$(base64 -i ./powersync.yaml) \
  -e POWERSYNC_SYNC_RULES_B64=$(base64 -i ./sync-rules.yaml) \
  --env-file ./.env.local \
  --network supabase_network_expo-supabase-todo \
  --name expo-powersync journeyapps/powersync-service:latest
```

This ensures that we have a PowerSync Service running with the sync rules specified in backend/sync-rules.yaml, and we’re using the Supabase database as both our source database and our database for the PowerSync Service’s sync bucket storage.
Sync Rules use a SQL-like syntax to tell the PowerSync Service how to handle data and changes from the source database, put them in the service’s bucket storage, and then sync to connecting clients. It’s often the case that the source database’s schema, the sync rules, and your app’s PowerSync schema have very similar shapes.
For this tutorial, we want to sync all thoughts and reactions to all users.
```yaml
bucket_definitions:
  user_lists:
    data:
      - SELECT * FROM thoughts
      - SELECT * FROM reactions
```
This is the topology we end up with:
How does a Syncosaurus travel? Local-first class.
Ready, Set, Sync!
Finally we get to run our app:
npm run start
Pick a simulator: press i to open the Expo Go app in your iOS simulator (shift + i lets you choose among multiple simulators).
Note: For Android emulators you may need to forward ports so that the emulator can reach local ports.
```shell
adb reverse tcp:8080 tcp:8080
adb reverse tcp:54321 tcp:54321
```
Opening Supabase Studio at http://127.0.0.1:54323/project/default, you can see that changes made in your app sync to Supabase, and changes made in Supabase are reflected in your app.
Oh no, I lost my internet connection!?
With everything working, we can kill the PowerSync Service to simulate an offline scenario between the devices. After the service is down, add some thoughts and reactions to both devices. Your app will keep updating with the changes made locally.
We can then restart the service (docker restart expo-powersync) to see both apps catch up on everything that happened during our mock network blip.
Bonus segments
Expo file system persister
The sql.js database adapter uses an in-memory persister by default, so restarting your development app means your local data disappears and has to be synced down again. We can specify a file persister with the help of the Expo FileSystem module so that changes are written to and read from disk.
Just slightly alter the SystemProvider.
```typescript
import { SupabaseConnector } from './SupabaseConnector';
import { SQLJSOpenFactory, SQLJSPersister } from '@powersync/adapter-sql-js';
import { createBaseLogger, LogLevel, PowerSyncContext, PowerSyncDatabase } from '@powersync/react-native';
import { AppSchema } from './AppSchema';
import * as FileSystem from 'expo-file-system';
import React, { PropsWithChildren } from 'react';

const SupabaseContext = React.createContext<SupabaseConnector | null>(null);
export const useSupabase = () => React.useContext(SupabaseContext);

const createSQLJSPersister = (dbFilename: string): SQLJSPersister => {
  const dbPath = `${FileSystem.documentDirectory}${dbFilename}`;
  return {
    readFile: async (): Promise<ArrayLike<number> | Buffer | null> => {
      try {
        const fileInfo = await FileSystem.getInfoAsync(dbPath);
        if (!fileInfo.exists) {
          return null;
        }
        const result = await FileSystem.readAsStringAsync(dbPath, {
          encoding: FileSystem.EncodingType.Base64
        });
        const binary = atob(result);
        const bytes = new Uint8Array(binary.length);
        for (let i = 0; i < binary.length; i++) {
          bytes[i] = binary.charCodeAt(i);
        }
        return bytes;
      } catch (error) {
        console.error('Error reading database file:', error);
        return null;
      }
    },
    writeFile: async (data: ArrayLike<number> | Buffer): Promise<void> => {
      try {
        const uint8Array = new Uint8Array(data);
        const binary = Array.from(uint8Array, (byte) => String.fromCharCode(byte)).join('');
        const base64 = btoa(binary);
        await FileSystem.writeAsStringAsync(dbPath, base64, {
          encoding: FileSystem.EncodingType.Base64
        });
      } catch (error) {
        console.error('Error writing database file:', error);
        throw error;
      }
    }
  };
};

export const powerSync = new PowerSyncDatabase({
  schema: AppSchema,
  database: new SQLJSOpenFactory({
    dbFilename: 'app.db',
    persister: createSQLJSPersister('app.db')
  })
});

export const connector = new SupabaseConnector();
powerSync.connect(connector);

const logger = createBaseLogger();
logger.useDefaults();
logger.setLevel(LogLevel.DEBUG);

export const SystemProvider = ({ children }: PropsWithChildren) => {
  return (
    <PowerSyncContext.Provider value={powerSync}>
      <SupabaseContext.Provider value={connector}>
        {children}
      </SupabaseContext.Provider>
    </PowerSyncContext.Provider>
  );
};

export default SystemProvider;
```
Raw SQL or use an ORM?
For simplicity, this tutorial used raw SQL queries; however, we also support two well-known database abstractions, Kysely and Drizzle ORM, for those who prefer type safety and ease of use. The examples below show how src/app/index.tsx would differ (assuming the appropriate setup).
With Kysely:

```typescript
import { usePowerSync, useQuery } from "@powersync/react-native";
import { sql } from 'kysely';

// watch query
const query = db.selectFrom('thoughts').selectAll();
const { data: thoughts } = useQuery(query);

// insert query
await db.insertInto('thoughts').values({
  id: sql`uuid()`,
  content: newThoughtContent.trim(),
  created_at: sql`datetime()`,
  created_by: connector.userId
}).execute();
```
With Drizzle:

```typescript
import { toCompilableQuery } from "@powersync/drizzle-driver";
import { usePowerSync, useQuery } from "@powersync/react-native";
import { sql } from 'drizzle-orm';

// watch query
const query = db.select().from(thoughts);
const { data: thoughts } = useQuery(toCompilableQuery(query));

// insert query
await db.insert(thoughts).values({
  id: sql`uuid()`,
  content: newThoughtContent.trim(),
  created_at: sql`datetime()`,
  created_by: connector.userId
});
```
Do these queries seem trivial?
We tried keeping the queries simple today, but you are only limited by the bounds of SQLite. For example, we spread our two queries for fetching thoughts and reactions across components, but if we wanted to run a single query in a higher-level component and trickle the data down, we could do so with a join query.
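As a sketch of such a join (our own example, built on the tutorial’s thoughts and reactions tables), a single query could fetch every thought along with its reaction count:

```sql
-- One query instead of a per-thought reactions query:
-- each thought with the number of reactions it has received.
SELECT
  t.id,
  t.content,
  t.created_at,
  COUNT(r.id) AS reaction_count
FROM thoughts t
LEFT JOIN reactions r ON r.thought_id = t.id
GROUP BY t.id
ORDER BY t.created_at DESC;
```

Because useQuery() re-executes whenever a query’s underlying tables change, a watched query like this stays live as new reactions sync in.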
As another example, what if we wanted to rate the thought authors:
```sql
WITH author_metrics AS (
  SELECT
    t.created_by,
    COUNT(t.id) as total_thoughts,
    COUNT(r.id) as total_reactions_received,
    COUNT(DISTINCT r.user_id) as unique_fans,
    COUNT(DISTINCT r.emoji) as emoji_types_received,
    ROUND(COUNT(r.id) * 1.0 / COUNT(t.id), 2) as reactions_per_thought,
    -- Time-based metrics
    COUNT(CASE WHEN t.created_at > datetime('now', '-7 days') THEN 1 END) as thoughts_last_week,
    COUNT(CASE WHEN r.created_at > datetime('now', '-7 days') THEN 1 END) as reactions_last_week,
    -- Best performing thought
    MAX((SELECT COUNT(*) FROM reactions r2 WHERE r2.thought_id = t.id)) as best_thought_reactions
  FROM thoughts t
  LEFT JOIN reactions r ON t.id = r.thought_id
  GROUP BY t.created_by
),
author_rankings AS (
  SELECT
    *,
    RANK() OVER (ORDER BY reactions_per_thought DESC) as engagement_rank,
    RANK() OVER (ORDER BY total_thoughts DESC) as volume_rank,
    RANK() OVER (ORDER BY unique_fans DESC) as influence_rank,
    -- Overall author score
    (reactions_per_thought * 2 + unique_fans * 0.5 + emoji_types_received * 0.3) as author_score
  FROM author_metrics
  WHERE total_thoughts >= 3 -- Only authors with meaningful activity
)
SELECT
  created_by,
  total_thoughts,
  total_reactions_received,
  reactions_per_thought,
  unique_fans,
  best_thought_reactions,
  thoughts_last_week,
  reactions_last_week,
  ROUND(author_score, 2) as author_score,
  'Top ' || engagement_rank || ' Engagement' as engagement_ranking,
  'Top ' || influence_rank || ' Influence' as influence_ranking,
  CASE
    WHEN author_score >= 10 THEN '🔥 Superstar'
    WHEN author_score >= 5 THEN '⭐ Rising Star'
    WHEN author_score >= 3 THEN '👍 Solid Contributor'
    ELSE '🌱 Getting Started'
  END as author_tier
FROM author_rankings
ORDER BY author_score DESC
LIMIT 10;
```
Which gives us this nice report.
```json
[
  {
    "author_score": 5.2,
    "author_tier": "⭐ Rising Star",
    "best_thought_reactions": 5,
    "created_by": "11111111-2222-4333-8444-555555555555",
    "engagement_ranking": "Top 1 Engagement",
    "influence_ranking": "Top 1 Influence",
    "reactions_last_week": 2,
    "reactions_per_thought": 1,
    "thoughts_last_week": 0,
    "total_reactions_received": 5,
    "total_thoughts": 5,
    "unique_fans": 4
  },
  {
    "author_score": 4.1,
    "author_tier": "👍 Solid Contributor",
    "best_thought_reactions": 3,
    "created_by": "33333333-4444-4555-a666-777777777777",
    "engagement_ranking": "Top 1 Engagement",
    "influence_ranking": "Top 2 Influence",
    "reactions_last_week": 0,
    "reactions_per_thought": 1,
    "thoughts_last_week": 0,
    "total_reactions_received": 3,
    "total_thoughts": 3,
    "unique_fans": 3
  }
]
```
So what have we really won here?
Is this a lot of work for no real benefit? No. Consider what it would take to implement ordering with a conventional backend: you would add sorting parameters to your REST endpoint, sort the response on your backend, and then update your client’s API calls to include those sort parameters.
With a local database, you can simply add ORDER BY to the query.
```typescript
useQuery(`SELECT * FROM thoughts`);
// turns into
useQuery(`SELECT * FROM thoughts ORDER BY created_at DESC`);
```
SELECT * FROM journey WHERE status = 'THE END'
We covered a lot of powerful features, but we hope the benefits and ease of use of PowerSync shone through.
For more about PowerSync see our documentation. For more info on using PowerSync with Expo see our Expo integration guide. Got questions? You can find our contact information here.


