EmDash source with visual editor image upload fix

Fixes:
1. media.ts: wrap placeholder generation in try-catch
2. toolbar.ts: check r.ok, display error message in popover
2026-05-03 10:44:54 +07:00
parent 78f81bebb6
commit 2d1be52177
2352 changed files with 662964 additions and 0 deletions


@@ -0,0 +1,59 @@
# @emdash-cms/auth
## 0.9.0
### Minor Changes
- [#800](https://github.com/emdash-cms/emdash/pull/800) [`e2d5d16`](https://github.com/emdash-cms/emdash/commit/e2d5d160acea4444945b1ea79c80ca9ce138965b) Thanks [@csfalcao](https://github.com/csfalcao)! - Adds support for accepting passkey assertions from multiple origins that share an `rpId`, for deployments reachable under several hostnames (apex + preview/staging) under one registrable parent. Declare additional origins via `EmDashConfig.allowedOrigins` (in `astro.config.mjs`) or the `EMDASH_ALLOWED_ORIGINS` env var (comma-separated); the two sources merge at runtime. EmDash validates the merged set against `siteUrl` and rejects dead config (non-subdomain entries, IP-literal `siteUrl`, trailing dots, empty labels) with source-attributed errors. `PasskeyConfig.origin: string` is replaced by `PasskeyConfig.origins: string[]`.
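A minimal sketch of the merge-and-validate behavior described above, assuming a hypothetical helper; the function name and exact rejection rules are illustrative, not EmDash's implementation:

```typescript
// Illustrative sketch: merge origins from config and the comma-separated
// env var, then require each origin's hostname to be the rpId itself or a
// subdomain of it (foreign hosts, trailing dots, etc. fail the check).
function mergeAllowedOrigins(
  configOrigins: string[],
  envValue: string | undefined,
  rpId: string,
): string[] {
  const fromEnv = (envValue ?? "")
    .split(",")
    .map((s) => s.trim())
    .filter(Boolean);
  const merged = Array.from(new Set([...configOrigins, ...fromEnv]));
  for (const origin of merged) {
    const host = new URL(origin).hostname;
    if (host !== rpId && !host.endsWith(`.${rpId}`)) {
      throw new Error(`allowedOrigins: ${origin} is not ${rpId} or a subdomain of it`);
    }
  }
  return merged;
}
```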
## 0.8.0
### Minor Changes
- [#779](https://github.com/emdash-cms/emdash/pull/779) [`e402890`](https://github.com/emdash-cms/emdash/commit/e402890fcd8647fdfe847bb34aa9f9e7094473dd) Thanks [@ascorbic](https://github.com/ascorbic)! - Adds `settings_get` and `settings_update` MCP tools so agents can read and update site-wide settings (title, tagline, logo, favicon, URL, posts-per-page, date format, timezone, social, SEO). `settings_get` resolves media references (logo/favicon/seo.defaultOgImage) to URLs; `settings_update` is a partial update that preserves omitted fields. New `settings:read` (EDITOR+) and `settings:manage` (ADMIN) API token scopes back the tools, with matching options in the personal API token settings UI.
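The "partial update preserves omitted fields" semantics of `settings_update` can be pictured with a small sketch; the helper name and settings shape here are illustrative, not EmDash's actual types:

```typescript
// Illustrative sketch of partial-update semantics: fields omitted from the
// patch keep their current values, and an explicit undefined is dropped so
// it cannot blank out an existing value.
interface SiteSettings {
  title: string;
  tagline: string;
  postsPerPage: number;
}

function applySettingsUpdate(
  current: SiteSettings,
  patch: Partial<SiteSettings>,
): SiteSettings {
  const cleaned = Object.fromEntries(
    Object.entries(patch).filter(([, v]) => v !== undefined),
  ) as Partial<SiteSettings>;
  return { ...current, ...cleaned };
}
```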
### Patch Changes
- [#398](https://github.com/emdash-cms/emdash/pull/398) [`31333dc`](https://github.com/emdash-cms/emdash/commit/31333dc593e2b9128113e4e923455209f11853fd) Thanks [@simnaut](https://github.com/simnaut)! - Adds pluggable auth provider system with AT Protocol as the first plugin-based provider. Refactors GitHub and Google OAuth from hardcoded buttons into the same `AuthProviderDescriptor` interface. All auth methods (passkey, AT Protocol, GitHub, Google) are equal options on the login page and setup wizard.
- [#777](https://github.com/emdash-cms/emdash/pull/777) [`3eca9d5`](https://github.com/emdash-cms/emdash/commit/3eca9d54be03a803d35e112f4114f85f53a23acd) Thanks [@ascorbic](https://github.com/ascorbic)! - Adds `taxonomies:manage` and `menus:manage` API token scopes for fine-grained control over taxonomy and menu mutations via MCP and REST. Existing tokens with `content:write` continue to work for those operations: `content:write` now implicitly grants `menus:manage` and `taxonomies:manage` so PATs issued before the split keep their effective permissions. The reverse implication does not hold — a token with only `menus:manage` cannot create or edit content.
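The one-way grant direction can be sketched as a scope check; the table and function names are illustrative, not the actual EmDash implementation:

```typescript
// Illustrative sketch: content:write implicitly grants the two new scopes,
// but a token holding only menus:manage gains nothing extra.
const IMPLIED_SCOPES: Record<string, string[]> = {
  "content:write": ["menus:manage", "taxonomies:manage"],
};

function hasEffectiveScope(tokenScopes: string[], required: string): boolean {
  if (tokenScopes.includes(required)) return true;
  return tokenScopes.some((s) => IMPLIED_SCOPES[s]?.includes(required) ?? false);
}
```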
## 0.7.0
### Patch Changes
- [#736](https://github.com/emdash-cms/emdash/pull/736) [`81fe93b`](https://github.com/emdash-cms/emdash/commit/81fe93bc675581ddd0161eaabbe7a3471ec76529) Thanks [@ascorbic](https://github.com/ascorbic)! - Restricts Subscriber-role access to draft, scheduled, and trashed content. Subscribers retain `content:read` for member-only published content but no longer see non-published items via the REST API or MCP server. Adds a new `content:read_drafts` permission (Contributor and above) that gates `/compare`, `/revisions`, `/trash`, `/preview-url`, and the corresponding MCP tools.
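A sketch of the visibility rule described above, with assumed permission strings matching the changelog; the function is illustrative, not EmDash's code:

```typescript
// Illustrative sketch: published content needs only content:read, while
// draft, scheduled, and trashed items require content:read_drafts
// (Contributor and above).
type Status = "published" | "draft" | "scheduled" | "trashed";

function canReadContent(permissions: Set<string>, status: Status): boolean {
  if (status === "published") return permissions.has("content:read");
  return permissions.has("content:read_drafts");
}
```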
## 0.6.0
### Patch Changes
- [#552](https://github.com/emdash-cms/emdash/pull/552) [`f52154d`](https://github.com/emdash-cms/emdash/commit/f52154da8afb838b1af6deccf33b5a261257ec7c) Thanks [@masonjames](https://github.com/masonjames)! - Fixes passkey login failures so unregistered or invalid credentials return an authentication failure instead of an internal server error.
## 0.5.0
### Patch Changes
- [#542](https://github.com/emdash-cms/emdash/pull/542) [`64f90d1`](https://github.com/emdash-cms/emdash/commit/64f90d1957af646ca200b9d70e856fa72393f001) Thanks [@mohamedmostafa58](https://github.com/mohamedmostafa58)! - Fixes the invite flow: corrects the invite URL to point to the admin UI page and adds InviteAcceptPage for passkey registration.
## 0.4.0
## 0.3.0
## 0.2.0
### Patch Changes
- [#452](https://github.com/emdash-cms/emdash/pull/452) [`1a93d51`](https://github.com/emdash-cms/emdash/commit/1a93d51777afaec239641e7587d6e32d8a590656) Thanks [@kamine81](https://github.com/kamine81)! - Fixes GitHub OAuth login failing with 403 on accounts where email is private. GitHub's API requires a `User-Agent` header and rejects requests without it.
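A sketch of the resulting request headers; the `User-Agent` value here is illustrative, but the requirement itself is GitHub's documented behavior:

```typescript
// Illustrative sketch of the fix: every call to api.github.com must carry
// a User-Agent header, or GitHub responds 403 even with a valid token.
function githubHeaders(accessToken: string): Record<string, string> {
  return {
    Authorization: `Bearer ${accessToken}`,
    Accept: "application/vnd.github+json",
    "User-Agent": "emdash-cms", // required by GitHub's API
  };
}
```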
## 0.1.1
### Patch Changes
- [#133](https://github.com/emdash-cms/emdash/pull/133) [`9269759`](https://github.com/emdash-cms/emdash/commit/9269759674bf254863f37d4cf1687fae56082063) Thanks [@kyjus25](https://github.com/kyjus25)! - Fixes auth links and OAuth callbacks to use `/_emdash/api/auth/...` so emailed sign-in, signup, and invite URLs resolve correctly in EmDash.
## 0.1.0
### Minor Changes
- [#14](https://github.com/emdash-cms/emdash/pull/14) [`755b501`](https://github.com/emdash-cms/emdash/commit/755b5017906811f97f78f4c0b5a0b62e67b52ec4) Thanks [@ascorbic](https://github.com/ascorbic)! - First beta release


@@ -0,0 +1,72 @@
{
"name": "@emdash-cms/auth",
"version": "0.9.0",
"description": "Passkey-first authentication for EmDash",
"type": "module",
"main": "dist/index.mjs",
"files": [
"dist",
"src"
],
"exports": {
".": {
"types": "./dist/index.d.mts",
"default": "./dist/index.mjs"
},
"./passkey": {
"types": "./dist/passkey/index.d.mts",
"default": "./dist/passkey/index.mjs"
},
"./adapters/kysely": {
"types": "./dist/adapters/kysely.d.mts",
"default": "./dist/adapters/kysely.mjs"
},
"./oauth/github": {
"types": "./dist/oauth/providers/github.d.mts",
"default": "./dist/oauth/providers/github.mjs"
},
"./oauth/google": {
"types": "./dist/oauth/providers/google.d.mts",
"default": "./dist/oauth/providers/google.mjs"
}
},
"scripts": {
"build": "tsdown",
"dev": "tsdown --watch",
"check": "publint && attw --pack --ignore-rules=cjs-resolves-to-esm --ignore-rules=no-resolution",
"test": "vitest",
"typecheck": "tsgo --noEmit"
},
"dependencies": {
"@oslojs/crypto": "catalog:",
"@oslojs/encoding": "catalog:",
"@oslojs/webauthn": "catalog:",
"ulidx": "^2.4.1",
"zod": "^4.3.5"
},
"peerDependencies": {
"astro": ">=6.0.0-beta.0",
"kysely": "^0.27.0"
},
"peerDependenciesMeta": {
"kysely": {
"optional": true
}
},
"devDependencies": {
"@arethetypeswrong/cli": "catalog:",
"@types/node": "catalog:",
"astro": "catalog:",
"publint": "catalog:",
"tsdown": "catalog:",
"typescript": "catalog:",
"vitest": "catalog:"
},
"repository": {
"type": "git",
"url": "git+https://github.com/emdash-cms/emdash.git",
"directory": "packages/auth"
},
"author": "Matt Kane",
"license": "MIT"
}


@@ -0,0 +1,715 @@
/**
* Kysely database adapter for @emdash-cms/auth
*/
import type { Kysely, Insertable, Selectable, Updateable } from "kysely";
import { ulid } from "ulidx";
import {
Role,
toRoleLevel,
toDeviceType,
toTokenType,
type AuthAdapter,
type User,
type NewUser,
type UpdateUser,
type Credential,
type NewCredential,
type AuthToken,
type NewAuthToken,
type TokenType,
type OAuthAccount,
type NewOAuthAccount,
type AllowedDomain,
type RoleLevel,
} from "../types.js";
// ============================================================================
// Database schema types
// ============================================================================
export interface AuthTables {
users: UserTable;
credentials: CredentialTable;
auth_tokens: AuthTokenTable;
oauth_accounts: OAuthAccountTable;
allowed_domains: AllowedDomainTable;
}
interface UserTable {
id: string;
email: string;
name: string | null;
avatar_url: string | null;
role: number;
email_verified: number;
disabled: number;
data: string | null;
created_at: string;
updated_at: string;
}
interface CredentialTable {
id: string;
user_id: string;
public_key: Uint8Array;
counter: number;
device_type: string;
backed_up: number;
transports: string | null;
name: string | null;
created_at: string;
last_used_at: string;
}
interface AuthTokenTable {
hash: string;
user_id: string | null;
email: string | null;
type: string;
role: number | null;
invited_by: string | null;
expires_at: string;
created_at: string;
}
interface OAuthAccountTable {
provider: string;
provider_account_id: string;
user_id: string;
created_at: string;
}
interface AllowedDomainTable {
domain: string;
default_role: number;
enabled: number;
created_at: string;
}
// ============================================================================
// Adapter implementation
// ============================================================================
export function createKyselyAdapter<T extends AuthTables>(db: Kysely<T>): AuthAdapter {
// Type cast to work with generic Kysely instance
// eslint-disable-next-line @typescript-eslint/no-unsafe-type-assertion -- generic Kysely<T extends AuthTables> narrowed to concrete AuthTables for internal queries
const kdb = db as unknown as Kysely<AuthTables>;
return {
// ========================================================================
// Users
// ========================================================================
async getUserById(id: string): Promise<User | null> {
const row = await kdb.selectFrom("users").selectAll().where("id", "=", id).executeTakeFirst();
return row ? rowToUser(row) : null;
},
async getUserByEmail(email: string): Promise<User | null> {
const row = await kdb
.selectFrom("users")
.selectAll()
.where("email", "=", email.toLowerCase())
.executeTakeFirst();
return row ? rowToUser(row) : null;
},
async createUser(user: NewUser): Promise<User> {
const now = new Date().toISOString();
const id = ulid();
const row: Insertable<UserTable> = {
id,
email: user.email.toLowerCase(),
name: user.name ?? null,
avatar_url: user.avatarUrl ?? null,
role: user.role ?? Role.SUBSCRIBER,
email_verified: user.emailVerified ? 1 : 0,
disabled: 0,
data: user.data ? JSON.stringify(user.data) : null,
created_at: now,
updated_at: now,
};
await kdb.insertInto("users").values(row).execute();
return {
id,
email: row.email,
name: user.name ?? null,
avatarUrl: user.avatarUrl ?? null,
role: toRoleLevel(row.role),
emailVerified: row.email_verified === 1,
disabled: false,
data: user.data ?? null,
createdAt: new Date(now),
updatedAt: new Date(now),
};
},
async updateUser(id: string, data: UpdateUser): Promise<void> {
const update: Updateable<UserTable> = {
updated_at: new Date().toISOString(),
};
if (data.email !== undefined) update.email = data.email.toLowerCase();
if (data.name !== undefined) update.name = data.name;
if (data.avatarUrl !== undefined) update.avatar_url = data.avatarUrl;
if (data.role !== undefined) update.role = data.role;
if (data.emailVerified !== undefined) update.email_verified = data.emailVerified ? 1 : 0;
if (data.disabled !== undefined) update.disabled = data.disabled ? 1 : 0;
if (data.data !== undefined) update.data = data.data ? JSON.stringify(data.data) : null;
await kdb.updateTable("users").set(update).where("id", "=", id).execute();
},
async deleteUser(id: string): Promise<void> {
await kdb.deleteFrom("users").where("id", "=", id).execute();
},
async countUsers(): Promise<number> {
const result = await kdb
.selectFrom("users")
.select((eb) => eb.fn.countAll<number>().as("count"))
.executeTakeFirstOrThrow();
return result.count;
},
async getUsers(options?: {
search?: string;
role?: number;
cursor?: string;
limit?: number;
}): Promise<{
items: Array<
User & {
lastLogin: Date | null;
credentialCount: number;
oauthProviders: string[];
}
>;
nextCursor?: string;
}> {
const limit = Math.min(options?.limit ?? 20, 100);
let query = kdb
.selectFrom("users")
.leftJoin("credentials", "users.id", "credentials.user_id")
.selectAll("users")
.select((eb) => [
eb.fn.count<number>("credentials.id").as("credential_count"),
eb.fn.max("credentials.last_used_at").as("last_login"),
])
.groupBy("users.id")
.orderBy("users.created_at", "desc")
.limit(limit + 1);
// Apply filters
if (options?.search) {
const searchPattern = `%${options.search}%`;
query = query.where((eb) =>
eb.or([
eb("users.email", "like", searchPattern),
eb("users.name", "like", searchPattern),
]),
);
}
if (options?.role !== undefined) {
query = query.where("users.role", "=", options.role);
}
if (options?.cursor) {
// Get the cursor user's created_at for pagination
const cursorUser = await kdb
.selectFrom("users")
.select("created_at")
.where("id", "=", options.cursor)
.executeTakeFirst();
if (cursorUser) {
query = query.where("users.created_at", "<", cursorUser.created_at);
}
}
const rows = await query.execute();
// Get OAuth providers for all users in this batch
const userIds = rows.slice(0, limit).map((r) => r.id);
const oauthAccounts =
userIds.length > 0
? await kdb
.selectFrom("oauth_accounts")
.select(["user_id", "provider"])
.where("user_id", "in", userIds)
.execute()
: [];
// Group OAuth providers by user
const oauthByUser = new Map<string, string[]>();
for (const account of oauthAccounts) {
const providers = oauthByUser.get(account.user_id) ?? [];
providers.push(account.provider);
oauthByUser.set(account.user_id, providers);
}
const hasMore = rows.length > limit;
const items = rows.slice(0, limit).map((row) => ({
id: row.id,
email: row.email,
name: row.name,
avatarUrl: row.avatar_url,
role: toRoleLevel(row.role),
emailVerified: row.email_verified === 1,
disabled: row.disabled === 1,
data: row.data ? JSON.parse(row.data) : null,
createdAt: new Date(row.created_at),
updatedAt: new Date(row.updated_at),
lastLogin: row.last_login ? new Date(row.last_login) : null,
credentialCount: row.credential_count ?? 0,
oauthProviders: oauthByUser.get(row.id) ?? [],
}));
return {
items,
nextCursor: hasMore ? items.at(-1)?.id : undefined,
};
},
async getUserWithDetails(id: string): Promise<{
user: User;
credentials: Credential[];
oauthAccounts: OAuthAccount[];
lastLogin: Date | null;
} | null> {
const user = await kdb
.selectFrom("users")
.selectAll()
.where("id", "=", id)
.executeTakeFirst();
if (!user) return null;
const [credentials, oauthAccounts] = await Promise.all([
kdb
.selectFrom("credentials")
.selectAll()
.where("user_id", "=", id)
.orderBy("created_at", "desc")
.execute(),
kdb.selectFrom("oauth_accounts").selectAll().where("user_id", "=", id).execute(),
]);
// Find last login from most recent credential use
const lastLogin = credentials.reduce<Date | null>((latest, cred) => {
const lastUsed = new Date(cred.last_used_at);
return !latest || lastUsed > latest ? lastUsed : latest;
}, null);
return {
user: rowToUser(user),
credentials: credentials.map(rowToCredential),
oauthAccounts: oauthAccounts.map(rowToOAuthAccount),
lastLogin,
};
},
async countAdmins(): Promise<number> {
const result = await kdb
.selectFrom("users")
.select((eb) => eb.fn.countAll<number>().as("count"))
.where("role", "=", Role.ADMIN)
.where("disabled", "=", 0)
.executeTakeFirstOrThrow();
return result.count;
},
// ========================================================================
// Credentials
// ========================================================================
async getCredentialById(id: string): Promise<Credential | null> {
const row = await kdb
.selectFrom("credentials")
.selectAll()
.where("id", "=", id)
.executeTakeFirst();
return row ? rowToCredential(row) : null;
},
async getCredentialsByUserId(userId: string): Promise<Credential[]> {
const rows = await kdb
.selectFrom("credentials")
.selectAll()
.where("user_id", "=", userId)
.execute();
return rows.map(rowToCredential);
},
async createCredential(credential: NewCredential): Promise<Credential> {
const now = new Date().toISOString();
const row: Insertable<CredentialTable> = {
id: credential.id,
user_id: credential.userId,
public_key: credential.publicKey,
counter: credential.counter,
device_type: credential.deviceType,
backed_up: credential.backedUp ? 1 : 0,
transports: credential.transports.length > 0 ? JSON.stringify(credential.transports) : null,
name: credential.name ?? null,
created_at: now,
last_used_at: now,
};
await kdb.insertInto("credentials").values(row).execute();
return {
id: credential.id,
userId: credential.userId,
publicKey: credential.publicKey,
counter: credential.counter,
deviceType: credential.deviceType,
backedUp: credential.backedUp,
transports: credential.transports,
name: credential.name ?? null,
createdAt: new Date(now),
lastUsedAt: new Date(now),
};
},
async updateCredentialCounter(id: string, counter: number): Promise<void> {
await kdb
.updateTable("credentials")
.set({
counter,
last_used_at: new Date().toISOString(),
})
.where("id", "=", id)
.execute();
},
async updateCredentialName(id: string, name: string | null): Promise<void> {
await kdb.updateTable("credentials").set({ name }).where("id", "=", id).execute();
},
async deleteCredential(id: string): Promise<void> {
await kdb.deleteFrom("credentials").where("id", "=", id).execute();
},
async countCredentialsByUserId(userId: string): Promise<number> {
const result = await kdb
.selectFrom("credentials")
.select((eb) => eb.fn.countAll<number>().as("count"))
.where("user_id", "=", userId)
.executeTakeFirstOrThrow();
return result.count;
},
// ========================================================================
// Auth Tokens
// ========================================================================
async createToken(token: NewAuthToken): Promise<void> {
const row: Insertable<AuthTokenTable> = {
hash: token.hash,
user_id: token.userId ?? null,
email: token.email ?? null,
type: token.type,
role: token.role ?? null,
invited_by: token.invitedBy ?? null,
expires_at: token.expiresAt.toISOString(),
created_at: new Date().toISOString(),
};
await kdb.insertInto("auth_tokens").values(row).execute();
},
async getToken(hash: string, type: TokenType): Promise<AuthToken | null> {
const row = await kdb
.selectFrom("auth_tokens")
.selectAll()
.where("hash", "=", hash)
.where("type", "=", type)
.executeTakeFirst();
return row ? rowToAuthToken(row) : null;
},
async deleteToken(hash: string): Promise<void> {
await kdb.deleteFrom("auth_tokens").where("hash", "=", hash).execute();
},
async deleteExpiredTokens(): Promise<void> {
await kdb
.deleteFrom("auth_tokens")
.where("expires_at", "<", new Date().toISOString())
.execute();
},
// ========================================================================
// OAuth Accounts
// ========================================================================
async getOAuthAccount(
provider: string,
providerAccountId: string,
): Promise<OAuthAccount | null> {
const row = await kdb
.selectFrom("oauth_accounts")
.selectAll()
.where("provider", "=", provider)
.where("provider_account_id", "=", providerAccountId)
.executeTakeFirst();
return row ? rowToOAuthAccount(row) : null;
},
async getOAuthAccountsByUserId(userId: string): Promise<OAuthAccount[]> {
const rows = await kdb
.selectFrom("oauth_accounts")
.selectAll()
.where("user_id", "=", userId)
.execute();
return rows.map(rowToOAuthAccount);
},
async createOAuthAccount(account: NewOAuthAccount): Promise<OAuthAccount> {
const now = new Date().toISOString();
const row: Insertable<OAuthAccountTable> = {
provider: account.provider,
provider_account_id: account.providerAccountId,
user_id: account.userId,
created_at: now,
};
await kdb.insertInto("oauth_accounts").values(row).execute();
return {
provider: account.provider,
providerAccountId: account.providerAccountId,
userId: account.userId,
createdAt: new Date(now),
};
},
async deleteOAuthAccount(provider: string, providerAccountId: string): Promise<void> {
await kdb
.deleteFrom("oauth_accounts")
.where("provider", "=", provider)
.where("provider_account_id", "=", providerAccountId)
.execute();
},
// ========================================================================
// Allowed Domains
// ========================================================================
async getAllowedDomain(domain: string): Promise<AllowedDomain | null> {
const row = await kdb
.selectFrom("allowed_domains")
.selectAll()
.where("domain", "=", domain.toLowerCase())
.executeTakeFirst();
return row ? rowToAllowedDomain(row) : null;
},
async getAllowedDomains(): Promise<AllowedDomain[]> {
const rows = await kdb.selectFrom("allowed_domains").selectAll().execute();
return rows.map(rowToAllowedDomain);
},
async createAllowedDomain(domain: string, defaultRole: RoleLevel): Promise<AllowedDomain> {
const now = new Date().toISOString();
const row: Insertable<AllowedDomainTable> = {
domain: domain.toLowerCase(),
default_role: defaultRole,
enabled: 1,
created_at: now,
};
await kdb.insertInto("allowed_domains").values(row).execute();
return {
domain: row.domain,
defaultRole,
enabled: true,
createdAt: new Date(now),
};
},
async updateAllowedDomain(
domain: string,
enabled: boolean,
defaultRole?: RoleLevel,
): Promise<void> {
const update: Updateable<AllowedDomainTable> = {
enabled: enabled ? 1 : 0,
};
if (defaultRole !== undefined) {
update.default_role = defaultRole;
}
await kdb
.updateTable("allowed_domains")
.set(update)
.where("domain", "=", domain.toLowerCase())
.execute();
},
async deleteAllowedDomain(domain: string): Promise<void> {
await kdb.deleteFrom("allowed_domains").where("domain", "=", domain.toLowerCase()).execute();
},
};
}
// ============================================================================
// Row converters
// ============================================================================
function rowToUser(row: Selectable<UserTable>): User {
return {
id: row.id,
email: row.email,
name: row.name,
avatarUrl: row.avatar_url,
role: toRoleLevel(row.role),
emailVerified: row.email_verified === 1,
disabled: row.disabled === 1,
data: row.data ? JSON.parse(row.data) : null,
createdAt: new Date(row.created_at),
updatedAt: new Date(row.updated_at),
};
}
function rowToCredential(row: Selectable<CredentialTable>): Credential {
return {
id: row.id,
userId: row.user_id,
publicKey: row.public_key,
counter: row.counter,
deviceType: toDeviceType(row.device_type),
backedUp: row.backed_up === 1,
transports: row.transports ? JSON.parse(row.transports) : [],
name: row.name,
createdAt: new Date(row.created_at),
lastUsedAt: new Date(row.last_used_at),
};
}
function rowToAuthToken(row: Selectable<AuthTokenTable>): AuthToken {
return {
hash: row.hash,
userId: row.user_id,
email: row.email,
type: toTokenType(row.type),
role: row.role != null ? toRoleLevel(row.role) : null,
invitedBy: row.invited_by,
expiresAt: new Date(row.expires_at),
createdAt: new Date(row.created_at),
};
}
function rowToOAuthAccount(row: Selectable<OAuthAccountTable>): OAuthAccount {
return {
provider: row.provider,
providerAccountId: row.provider_account_id,
userId: row.user_id,
createdAt: new Date(row.created_at),
};
}
function rowToAllowedDomain(row: Selectable<AllowedDomainTable>): AllowedDomain {
return {
domain: row.domain,
defaultRole: toRoleLevel(row.default_role),
enabled: row.enabled === 1,
createdAt: new Date(row.created_at),
};
}
// ============================================================================
// Migration SQL
// ============================================================================
export const AUTH_TABLES_SQL = `
-- Users (no password_hash)
CREATE TABLE IF NOT EXISTS users (
id TEXT PRIMARY KEY,
email TEXT UNIQUE NOT NULL,
name TEXT,
avatar_url TEXT,
role INTEGER NOT NULL DEFAULT 10,
email_verified INTEGER NOT NULL DEFAULT 0,
disabled INTEGER NOT NULL DEFAULT 0,
data TEXT,
created_at TEXT NOT NULL,
updated_at TEXT NOT NULL
);
CREATE INDEX IF NOT EXISTS idx_users_email ON users(email);
-- Passkey credentials
CREATE TABLE IF NOT EXISTS credentials (
id TEXT PRIMARY KEY,
user_id TEXT NOT NULL REFERENCES users(id) ON DELETE CASCADE,
public_key BLOB NOT NULL,
counter INTEGER NOT NULL DEFAULT 0,
device_type TEXT NOT NULL,
backed_up INTEGER NOT NULL DEFAULT 0,
transports TEXT,
name TEXT,
created_at TEXT NOT NULL,
last_used_at TEXT NOT NULL
);
CREATE INDEX IF NOT EXISTS idx_credentials_user ON credentials(user_id);
-- Auth tokens (magic links, email verification, invites)
CREATE TABLE IF NOT EXISTS auth_tokens (
hash TEXT PRIMARY KEY,
user_id TEXT REFERENCES users(id) ON DELETE CASCADE,
email TEXT,
type TEXT NOT NULL,
role INTEGER,
invited_by TEXT REFERENCES users(id),
expires_at TEXT NOT NULL,
created_at TEXT NOT NULL
);
CREATE INDEX IF NOT EXISTS idx_auth_tokens_email ON auth_tokens(email);
-- OAuth accounts (external provider links)
CREATE TABLE IF NOT EXISTS oauth_accounts (
provider TEXT NOT NULL,
provider_account_id TEXT NOT NULL,
user_id TEXT NOT NULL REFERENCES users(id) ON DELETE CASCADE,
created_at TEXT NOT NULL,
PRIMARY KEY (provider, provider_account_id)
);
CREATE INDEX IF NOT EXISTS idx_oauth_accounts_user ON oauth_accounts(user_id);
-- Allowed domains for self-signup
CREATE TABLE IF NOT EXISTS allowed_domains (
domain TEXT PRIMARY KEY,
default_role INTEGER NOT NULL DEFAULT 20,
enabled INTEGER NOT NULL DEFAULT 1,
created_at TEXT NOT NULL
);
`;
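The adapter's `getUsers` uses the "fetch limit + 1" keyset trick: it requests one extra row, and if that row arrives there is another page, whose cursor is the last returned item's id. A generic sketch of that slice step (the helper name is illustrative):

```typescript
// Illustrative sketch of limit+1 keyset pagination: the extra row only
// signals that more pages exist; it is never returned to the caller.
function paginate<T extends { id: string }>(
  rows: T[],
  limit: number,
): { items: T[]; nextCursor?: string } {
  const hasMore = rows.length > limit;
  const items = rows.slice(0, limit);
  return { items, nextCursor: hasMore ? items.at(-1)?.id : undefined };
}
```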

packages/auth/src/config.ts

@@ -0,0 +1,214 @@
/**
* Configuration schema for @emdash-cms/auth
*/
import { z } from "zod";
import type { RoleName } from "./types.js";
/** Matches http(s) scheme at start of URL */
const HTTP_SCHEME_RE = /^https?:\/\//i;
/** Validates that a URL string uses http or https scheme. Rejects javascript:/data: URI XSS vectors. */
const httpUrl = z
.string()
.url()
.refine((url) => HTTP_SCHEME_RE.test(url), "URL must use http or https");
/**
* OAuth provider configuration
*/
const oauthProviderSchema = z.object({
clientId: z.string(),
clientSecret: z.string(),
});
/**
* Full auth configuration schema
*/
export const authConfigSchema = z.object({
/**
* Secret key for encrypting tokens and session data.
* Generate with: `emdash auth secret`
*/
secret: z.string().min(32, "Auth secret must be at least 32 characters"),
/**
* Passkey (WebAuthn) configuration
*/
passkeys: z
.object({
/**
* Relying party name shown to users during passkey registration
*/
rpName: z.string(),
/**
* Relying party ID (domain). Defaults to the hostname from baseUrl.
*/
rpId: z.string().optional(),
})
.optional(),
/**
* Self-signup configuration
*/
selfSignup: z
.object({
/**
* Email domains allowed to self-register
*/
domains: z.array(z.string()),
/**
* Default role for self-registered users
*/
defaultRole: z.enum(["subscriber", "contributor", "author"] as const).default("contributor"),
})
.optional(),
/**
* OAuth provider configurations (for "Login with X")
*/
oauth: z
.object({
github: oauthProviderSchema.optional(),
google: oauthProviderSchema.optional(),
})
.optional(),
/**
* Configure EmDash as an OAuth provider
*/
provider: z
.object({
enabled: z.boolean(),
/**
* Issuer URL for OIDC. Defaults to site URL.
*/
issuer: httpUrl.optional(),
})
.optional(),
/**
* Enterprise SSO configuration
*/
sso: z
.object({
enabled: z.boolean(),
})
.optional(),
/**
* Session configuration
*/
session: z
.object({
/**
* Session max age in seconds. Default: 30 days
*/
maxAge: z.number().default(30 * 24 * 60 * 60),
/**
* Extend session on activity. Default: true
*/
sliding: z.boolean().default(true),
})
.optional(),
});
export type AuthConfig = z.infer<typeof authConfigSchema>;
/**
* Validated and resolved auth configuration
*/
export interface ResolvedAuthConfig {
secret: string;
baseUrl: string;
siteName: string;
passkeys: {
rpName: string;
rpId: string;
origin: string;
};
selfSignup?: {
domains: string[];
defaultRole: RoleName;
};
oauth?: {
github?: {
clientId: string;
clientSecret: string;
};
google?: {
clientId: string;
clientSecret: string;
};
};
provider?: {
enabled: boolean;
issuer: string;
};
sso?: {
enabled: boolean;
};
session: {
maxAge: number;
sliding: boolean;
};
}
const selfSignupRoleMap: Record<"subscriber" | "contributor" | "author", RoleName> = {
subscriber: "SUBSCRIBER",
contributor: "CONTRIBUTOR",
author: "AUTHOR",
};
/**
* Resolve auth configuration with defaults
*/
export function resolveConfig(
config: AuthConfig,
baseUrl: string,
siteName: string,
): ResolvedAuthConfig {
const url = new URL(baseUrl);
return {
secret: config.secret,
baseUrl,
siteName,
passkeys: {
rpName: config.passkeys?.rpName ?? siteName,
rpId: config.passkeys?.rpId ?? url.hostname,
origin: url.origin,
},
selfSignup: config.selfSignup
? {
domains: config.selfSignup.domains.map((d) => d.toLowerCase()),
defaultRole: selfSignupRoleMap[config.selfSignup.defaultRole],
}
: undefined,
oauth: config.oauth,
provider: config.provider
? {
enabled: config.provider.enabled,
issuer: config.provider.issuer ?? baseUrl,
}
: undefined,
sso: config.sso,
session: {
maxAge: config.session?.maxAge ?? 30 * 24 * 60 * 60,
sliding: config.session?.sliding ?? true,
},
};
}
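The passkey defaulting in `resolveConfig` above can be shown in isolation; this self-contained sketch reproduces just that fallback logic (it is not the exported function itself):

```typescript
// Illustrative sketch of the passkey defaults: rpName falls back to the
// site name, rpId to the baseUrl hostname, and origin is the baseUrl origin.
function defaultPasskeys(
  baseUrl: string,
  siteName: string,
  rpId?: string,
  rpName?: string,
): { rpName: string; rpId: string; origin: string } {
  const url = new URL(baseUrl);
  return {
    rpName: rpName ?? siteName,
    rpId: rpId ?? url.hostname,
    origin: url.origin,
  };
}
```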

packages/auth/src/index.ts

@@ -0,0 +1,137 @@
/**
* @emdash-cms/auth - Passkey-first authentication for EmDash
*
* Email is now handled by the plugin email pipeline (see PLUGIN-EMAIL.md).
* Auth functions accept an optional `email` send function instead of a
* hardcoded adapter. The route layer bridges `emdash.email.send()` from
* the pipeline into the auth functions.
*
* @example
* ```ts
* import { auth } from '@emdash-cms/auth'
*
* export default defineConfig({
* integrations: [
* emdash({
* auth: auth({
* secret: import.meta.env.EMDASH_AUTH_SECRET,
* passkeys: { rpName: 'My Site' },
* }),
* }),
* ],
* })
* ```
*/
// Types
export * from "./types.js";
// Config
import { authConfigSchema as _authConfigSchema } from "./config.js";
export {
authConfigSchema,
resolveConfig,
type AuthConfig,
type ResolvedAuthConfig,
} from "./config.js";
// RBAC
export {
Permissions,
hasPermission,
requirePermission,
canActOnOwn,
requirePermissionOnResource,
PermissionError,
scopesForRole,
clampScopes,
type Permission,
} from "./rbac.js";
// Tokens
export {
generateToken,
hashToken,
generateTokenWithHash,
generateSessionId,
generateAuthSecret,
secureCompare,
encrypt,
decrypt,
// Prefixed API tokens (ec_pat_, ec_oat_, ec_ort_)
TOKEN_PREFIXES,
generatePrefixedToken,
hashPrefixedToken,
// Scopes
VALID_SCOPES,
validateScopes,
hasScope,
type ApiTokenScope,
// PKCE
computeS256Challenge,
} from "./tokens.js";
// Passkey
export * from "./passkey/index.js";
// Magic Link
export {
sendMagicLink,
verifyMagicLink,
MagicLinkError,
type MagicLinkConfig,
} from "./magic-link/index.js";
// Invite
export {
createInvite,
createInviteToken,
validateInvite,
completeInvite,
InviteError,
escapeHtml,
type InviteConfig,
type InviteTokenResult,
type EmailSendFn,
} from "./invite.js";
// Signup
export {
canSignup,
requestSignup,
validateSignupToken,
completeSignup,
SignupError,
type SignupConfig,
} from "./signup.js";
// OAuth
export {
createAuthorizationUrl,
handleOAuthCallback,
findOrCreateOAuthUser,
OAuthError,
github,
google,
type CanSelfSignup,
type StateStore,
type OAuthConsumerConfig,
} from "./oauth/consumer.js";
export type { OAuthProvider, OAuthConfig, OAuthProfile, OAuthState } from "./oauth/types.js";
// Email types (implementations moved to plugin email pipeline)
export type { EmailAdapter, EmailMessage } from "./types.js";
/**
* Create an auth configuration
*
* This is a helper function that validates the config at runtime.
*/
export function auth(config: import("./config.js").AuthConfig): import("./config.js").AuthConfig {
// Validate config
const result = _authConfigSchema.safeParse(config);
if (!result.success) {
throw new Error(`Invalid auth config: ${result.error.message}`);
}
return result.data;
}

packages/auth/src/invite.ts

@@ -0,0 +1,205 @@
/**
* Invite system for new users
*/
import { generateTokenWithHash, hashToken } from "./tokens.js";
import type { AuthAdapter, RoleLevel, EmailMessage, User } from "./types.js";
/** Escape HTML special characters to prevent injection in email templates */
export function escapeHtml(s: string): string {
return s
.replaceAll("&", "&amp;")
.replaceAll("<", "&lt;")
.replaceAll(">", "&gt;")
.replaceAll('"', "&quot;");
}
const TOKEN_EXPIRY_MS = 7 * 24 * 60 * 60 * 1000; // 7 days
/** Function that sends an email (matches the EmailPipeline.send signature) */
export type EmailSendFn = (message: EmailMessage) => Promise<void>;
export interface InviteConfig {
baseUrl: string;
siteName: string;
/** Optional email sender. When omitted, invite URL is returned without sending. */
email?: EmailSendFn;
}
/** Result of creating an invite token (without sending email) */
export interface InviteTokenResult {
/** The complete invite URL */
url: string;
/** The invite email address */
email: string;
}
/**
* Create an invite token and URL without sending email.
*
* Validates the user doesn't already exist, generates a token, stores it,
* and returns the invite URL. Callers decide whether to send email or
* display the URL as a copy-link fallback.
*/
export async function createInviteToken(
config: Pick<InviteConfig, "baseUrl">,
adapter: AuthAdapter,
email: string,
role: RoleLevel,
invitedBy: string,
): Promise<InviteTokenResult> {
// Check if user already exists
const existing = await adapter.getUserByEmail(email);
if (existing) {
throw new InviteError("user_exists", "A user with this email already exists");
}
// Generate token
const { token, hash } = generateTokenWithHash();
// Store token
await adapter.createToken({
hash,
email,
type: "invite",
role,
invitedBy,
expiresAt: new Date(Date.now() + TOKEN_EXPIRY_MS),
});
// Build invite URL pointing to the admin UI page (not the API endpoint).
// The admin SPA handles token validation and passkey registration.
const url = new URL(`${config.baseUrl}/admin/invite/accept`);
url.searchParams.set("token", token);
return { url: url.toString(), email };
}
/**
* Build the invite email message.
*/
function buildInviteEmail(inviteUrl: string, email: string, siteName: string): EmailMessage {
const safeName = escapeHtml(siteName);
return {
to: email,
subject: `You've been invited to ${siteName}`,
text: `You've been invited to join ${siteName}.\n\nClick this link to create your account:\n${inviteUrl}\n\nThis link expires in 7 days.`,
html: `
<!DOCTYPE html>
<html>
<head>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
</head>
<body style="font-family: -apple-system, BlinkMacSystemFont, 'Segoe UI', Roboto, sans-serif; line-height: 1.5; color: #333; max-width: 600px; margin: 0 auto; padding: 20px;">
<h1 style="font-size: 24px; margin-bottom: 20px;">You've been invited to ${safeName}</h1>
<p>Click the button below to create your account:</p>
<p style="margin: 30px 0;">
<a href="${inviteUrl}" style="background-color: #0066cc; color: white; padding: 12px 24px; text-decoration: none; border-radius: 6px; display: inline-block;">Accept Invite</a>
</p>
<p style="color: #666; font-size: 14px;">This link expires in 7 days.</p>
</body>
</html>`,
};
}
/**
* Create and send an invite to a new user.
*
* When `config.email` is provided, sends the invite email.
* When omitted, creates the token and returns the invite URL
* without sending (for the copy-link fallback).
*/
export async function createInvite(
config: InviteConfig,
adapter: AuthAdapter,
email: string,
role: RoleLevel,
invitedBy: string,
): Promise<InviteTokenResult> {
const result = await createInviteToken(config, adapter, email, role, invitedBy);
// Send email if a sender is configured
if (config.email) {
const message = buildInviteEmail(result.url, email, config.siteName);
await config.email(message);
}
return result;
}
/**
* Validate an invite token and return the invite data
*/
export async function validateInvite(
adapter: AuthAdapter,
token: string,
): Promise<{ email: string; role: RoleLevel }> {
const hash = hashToken(token);
const authToken = await adapter.getToken(hash, "invite");
if (!authToken) {
throw new InviteError("invalid_token", "Invalid or expired invite link");
}
if (authToken.expiresAt < new Date()) {
await adapter.deleteToken(hash);
throw new InviteError("token_expired", "This invite has expired");
}
if (!authToken.email || authToken.role === null) {
throw new InviteError("invalid_token", "Invalid invite data");
}
return {
email: authToken.email,
role: authToken.role,
};
}
/**
* Complete the invite process (after passkey registration)
*/
export async function completeInvite(
adapter: AuthAdapter,
token: string,
userData: {
name?: string;
avatarUrl?: string;
},
): Promise<User> {
const hash = hashToken(token);
// Validate token one more time
const authToken = await adapter.getToken(hash, "invite");
if (!authToken || authToken.expiresAt < new Date()) {
throw new InviteError("invalid_token", "Invalid or expired invite");
}
if (!authToken.email || authToken.role === null) {
throw new InviteError("invalid_token", "Invalid invite data");
}
// Delete token (single-use)
await adapter.deleteToken(hash);
// Create user
const user = await adapter.createUser({
email: authToken.email,
name: userData.name,
avatarUrl: userData.avatarUrl,
role: authToken.role,
emailVerified: true, // Email verified by accepting invite
});
return user;
}
export class InviteError extends Error {
constructor(
public code: "invalid_token" | "token_expired" | "user_exists",
message: string,
) {
super(message);
this.name = "InviteError";
}
}
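The invite flow above stores only a hash of the token: the raw token travels in the emailed URL, while `adapter.createToken` persists the hash, so a leaked token table yields no usable links. A minimal self-contained sketch of that scheme (illustrative names; the real `generateTokenWithHash`/`hashToken` live in `tokens.js` and may differ in length or encoding):

```typescript
import { createHash, randomBytes } from "node:crypto";

// Illustrative stand-ins for generateTokenWithHash/hashToken from tokens.js.
function generateTokenWithHashSketch(): { token: string; hash: string } {
  // Raw token: 32 random bytes, URL-safe for use in a query parameter.
  const token = randomBytes(32).toString("base64url");
  // Only this hash is persisted (e.g. via adapter.createToken({ hash, ... })).
  const hash = createHash("sha256").update(token).digest("hex");
  return { token, hash };
}

// On verification, hash the presented token and look it up by hash.
function hashTokenSketch(token: string): string {
  return createHash("sha256").update(token).digest("hex");
}
```

`validateInvite` and `completeInvite` both follow this pattern: hash the presented token, then fetch the stored record by hash.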

/**
* Magic link authentication
*/
import { escapeHtml } from "../invite.js";
import { generateTokenWithHash, hashToken } from "../tokens.js";
import type { AuthAdapter, User, EmailMessage } from "../types.js";
const TOKEN_EXPIRY_MS = 15 * 60 * 1000; // 15 minutes
/** Function that sends an email (matches the EmailPipeline.send signature) */
export type EmailSendFn = (message: EmailMessage) => Promise<void>;
export interface MagicLinkConfig {
baseUrl: string;
siteName: string;
/** Optional email sender. When omitted, magic links cannot be sent. */
email?: EmailSendFn;
}
/**
* Add artificial delay with jitter to prevent timing attacks.
* Range approximates the time for token creation + email send.
*/
async function timingDelay(): Promise<void> {
const delay = 100 + Math.random() * 150; // 100-250ms
await new Promise((resolve) => setTimeout(resolve, delay));
}
/**
* Send a magic link to a user's email.
*
* Requires `config.email` to be set. Throws if no email sender is configured.
*/
export async function sendMagicLink(
config: MagicLinkConfig,
adapter: AuthAdapter,
email: string,
type: "magic_link" | "recovery" = "magic_link",
): Promise<void> {
if (!config.email) {
throw new MagicLinkError("email_not_configured", "Email is not configured");
}
// Find user
const user = await adapter.getUserByEmail(email);
if (!user) {
// Don't reveal whether user exists - add delay to match successful path timing
await timingDelay();
return;
}
// Generate token
const { token, hash } = generateTokenWithHash();
// Store token hash
await adapter.createToken({
hash,
userId: user.id,
email: user.email,
type,
expiresAt: new Date(Date.now() + TOKEN_EXPIRY_MS),
});
// Build magic link URL
const url = new URL("/_emdash/api/auth/magic-link/verify", config.baseUrl);
url.searchParams.set("token", token);
// Send email
const safeName = escapeHtml(config.siteName);
await config.email({
to: user.email,
subject: `Sign in to ${config.siteName}`,
text: `Click this link to sign in to ${config.siteName}:\n\n${url.toString()}\n\nThis link expires in 15 minutes.\n\nIf you didn't request this, you can safely ignore this email.`,
html: `
<!DOCTYPE html>
<html>
<head>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
</head>
<body style="font-family: -apple-system, BlinkMacSystemFont, 'Segoe UI', Roboto, sans-serif; line-height: 1.5; color: #333; max-width: 600px; margin: 0 auto; padding: 20px;">
<h1 style="font-size: 24px; margin-bottom: 20px;">Sign in to ${safeName}</h1>
<p>Click the button below to sign in:</p>
<p style="margin: 30px 0;">
<a href="${url.toString()}" style="background-color: #0066cc; color: white; padding: 12px 24px; text-decoration: none; border-radius: 6px; display: inline-block;">Sign in</a>
</p>
<p style="color: #666; font-size: 14px;">This link expires in 15 minutes.</p>
<p style="color: #666; font-size: 14px;">If you didn't request this, you can safely ignore this email.</p>
</body>
</html>`,
});
}
/**
* Verify a magic link token and return the user
*/
export async function verifyMagicLink(adapter: AuthAdapter, token: string): Promise<User> {
const hash = hashToken(token);
// Find and validate token
const authToken = await adapter.getToken(hash, "magic_link");
if (!authToken) {
// Also check for recovery tokens
const recoveryToken = await adapter.getToken(hash, "recovery");
if (!recoveryToken) {
throw new MagicLinkError("invalid_token", "Invalid or expired link");
}
return verifyTokenAndGetUser(adapter, recoveryToken, hash);
}
return verifyTokenAndGetUser(adapter, authToken, hash);
}
async function verifyTokenAndGetUser(
adapter: AuthAdapter,
authToken: { userId: string | null; expiresAt: Date },
hash: string,
): Promise<User> {
// Check expiry
if (authToken.expiresAt < new Date()) {
await adapter.deleteToken(hash);
throw new MagicLinkError("token_expired", "This link has expired");
}
// Delete token (single-use)
await adapter.deleteToken(hash);
// Get user
if (!authToken.userId) {
throw new MagicLinkError("invalid_token", "Invalid token");
}
const user = await adapter.getUserById(authToken.userId);
if (!user) {
throw new MagicLinkError("user_not_found", "User not found");
}
return user;
}
export class MagicLinkError extends Error {
constructor(
public code: "invalid_token" | "token_expired" | "user_not_found" | "email_not_configured",
message: string,
) {
super(message);
this.name = "MagicLinkError";
}
}
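Both `InviteConfig` and `MagicLinkConfig` accept the same optional `EmailSendFn`, so in local development a console-backed sender is enough to see the generated URLs without SMTP. A hypothetical sketch (not part of the package; the inline message type mirrors the `EmailMessage` fields used above):

```typescript
// Hypothetical dev-only sender; the message type is a structural stand-in
// for EmailMessage from types.js.
type DevEmailMessage = { to: string; subject: string; text: string; html?: string };

function formatDevEmail(message: DevEmailMessage): string {
  return `to=${message.to}\nsubject=${message.subject}\n\n${message.text}`;
}

// Matches the EmailSendFn shape: (message) => Promise<void>.
const consoleEmailSender = async (message: DevEmailMessage): Promise<void> => {
  console.log(formatDevEmail(message));
};
```

Pass it as `email: consoleEmailSender` in the config. When `email` is omitted entirely, `createInvite` still returns the URL for the copy-link fallback, while `sendMagicLink` throws `email_not_configured`.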

/**
* OAuth consumer - "Login with X" functionality
*/
import { sha256 } from "@oslojs/crypto/sha2";
import { encodeBase64urlNoPadding } from "@oslojs/encoding";
import { z } from "zod";
import type { AuthAdapter, User, RoleLevel } from "../types.js";
import { github, fetchGitHubEmail } from "./providers/github.js";
import { google } from "./providers/google.js";
import type { OAuthProvider, OAuthConfig, OAuthProfile, OAuthState } from "./types.js";
export { github, google };
export interface OAuthConsumerConfig {
baseUrl: string;
providers: {
github?: OAuthConfig;
google?: OAuthConfig;
};
/**
* Check if self-signup is allowed for this email domain
*/
canSelfSignup?: (email: string) => Promise<{ allowed: boolean; role: RoleLevel } | null>;
}
/**
* Generate an OAuth authorization URL
*/
export async function createAuthorizationUrl(
config: OAuthConsumerConfig,
providerName: "github" | "google",
stateStore: StateStore,
): Promise<{ url: string; state: string }> {
const providerConfig = config.providers[providerName];
if (!providerConfig) {
throw new Error(`OAuth provider ${providerName} not configured`);
}
const provider = getProvider(providerName);
const state = generateState();
const redirectUri = new URL(
`/_emdash/api/auth/oauth/${providerName}/callback`,
config.baseUrl,
).toString();
// Generate PKCE code verifier for providers that support it
const codeVerifier = generateCodeVerifier();
const codeChallenge = await generateCodeChallenge(codeVerifier);
// Store state for verification
await stateStore.set(state, {
provider: providerName,
redirectUri,
codeVerifier,
});
// Build authorization URL
const url = new URL(provider.authorizeUrl);
url.searchParams.set("client_id", providerConfig.clientId);
url.searchParams.set("redirect_uri", redirectUri);
url.searchParams.set("response_type", "code");
url.searchParams.set("scope", provider.scopes.join(" "));
url.searchParams.set("state", state);
// PKCE for all providers (GitHub has supported S256 since 2021)
url.searchParams.set("code_challenge", codeChallenge);
url.searchParams.set("code_challenge_method", "S256");
return { url: url.toString(), state };
}
/**
* Handle OAuth callback
*/
export async function handleOAuthCallback(
config: OAuthConsumerConfig,
adapter: AuthAdapter,
providerName: "github" | "google",
code: string,
state: string,
stateStore: StateStore,
): Promise<User> {
const providerConfig = config.providers[providerName];
if (!providerConfig) {
throw new Error(`OAuth provider ${providerName} not configured`);
}
// Verify state
const storedState = await stateStore.get(state);
if (!storedState || storedState.provider !== providerName) {
throw new OAuthError("invalid_state", "Invalid OAuth state");
}
// Delete state (single-use)
await stateStore.delete(state);
const provider = getProvider(providerName);
// Exchange code for tokens
const tokens = await exchangeCode(
provider,
providerConfig,
code,
storedState.redirectUri,
storedState.codeVerifier,
);
// Fetch user profile
const profile = await fetchProfile(provider, tokens.accessToken, providerName);
// Find or create user
return findOrCreateOAuthUser(adapter, providerName, profile, config.canSelfSignup);
}
/**
* Exchange authorization code for tokens
*/
async function exchangeCode(
provider: OAuthProvider,
config: OAuthConfig,
code: string,
redirectUri: string,
codeVerifier?: string,
): Promise<{ accessToken: string; idToken?: string }> {
const body = new URLSearchParams({
grant_type: "authorization_code",
code,
redirect_uri: redirectUri,
client_id: config.clientId,
client_secret: config.clientSecret,
});
if (codeVerifier) {
body.set("code_verifier", codeVerifier);
}
const response = await fetch(provider.tokenUrl, {
method: "POST",
headers: {
"Content-Type": "application/x-www-form-urlencoded",
Accept: "application/json",
},
body,
});
if (!response.ok) {
const error = await response.text();
throw new OAuthError("token_exchange_failed", `Token exchange failed: ${error}`);
}
const json: unknown = await response.json();
const data = z
.object({
access_token: z.string(),
id_token: z.string().optional(),
})
.parse(json);
return {
accessToken: data.access_token,
idToken: data.id_token,
};
}
/**
* Fetch user profile from OAuth provider
*/
async function fetchProfile(
provider: OAuthProvider,
accessToken: string,
providerName: string,
): Promise<OAuthProfile> {
if (!provider.userInfoUrl) {
throw new Error("Provider does not have userinfo URL");
}
const response = await fetch(provider.userInfoUrl, {
headers: {
Authorization: `Bearer ${accessToken}`,
Accept: "application/json",
"User-Agent": "emdash-cms",
},
});
if (!response.ok) {
throw new OAuthError("profile_fetch_failed", `Failed to fetch profile: ${response.status}`);
}
  const data: unknown = await response.json();
  const profile = provider.parseProfile(data);
// GitHub may not return email in main profile
if (providerName === "github" && !profile.email) {
profile.email = await fetchGitHubEmail(accessToken);
}
return profile;
}
/**
* Signup policy callback.
* Return `{ allowed: true, role }` to permit signup, or `null` to deny.
*/
export type CanSelfSignup = (
email: string,
) => Promise<{ allowed: boolean; role: RoleLevel } | null>;
/**
* Find existing user or create new one (with auto-linking).
*
* Shared across all OAuth providers (GitHub, Google, AT Protocol, etc.).
* The provider-specific token exchange happens before this function is called;
* this function only deals with the EmDash user record.
*/
export async function findOrCreateOAuthUser(
adapter: AuthAdapter,
providerName: string,
profile: OAuthProfile,
canSelfSignup?: CanSelfSignup,
): Promise<User> {
// Check if OAuth account already linked
const existingAccount = await adapter.getOAuthAccount(providerName, profile.id);
if (existingAccount) {
const user = await adapter.getUserById(existingAccount.userId);
if (!user) {
throw new OAuthError("user_not_found", "Linked user not found");
}
return user;
}
// Check if user with this email exists (auto-link)
// Only auto-link when the provider has verified the email to prevent
// account takeover via unverified email on a third-party provider
const existingUser = await adapter.getUserByEmail(profile.email);
if (existingUser) {
if (!profile.emailVerified) {
throw new OAuthError(
"signup_not_allowed",
"Cannot link account: email not verified by provider",
);
}
await adapter.createOAuthAccount({
provider: providerName,
providerAccountId: profile.id,
userId: existingUser.id,
});
return existingUser;
}
// Check if self-signup is allowed
if (canSelfSignup) {
const signup = await canSelfSignup(profile.email);
if (signup?.allowed) {
// Create new user
const user = await adapter.createUser({
email: profile.email,
name: profile.name,
avatarUrl: profile.avatarUrl,
role: signup.role,
emailVerified: profile.emailVerified,
});
// Link OAuth account
await adapter.createOAuthAccount({
provider: providerName,
providerAccountId: profile.id,
userId: user.id,
});
return user;
}
}
throw new OAuthError("signup_not_allowed", "Self-signup not allowed for this email domain");
}
function getProvider(name: "github" | "google"): OAuthProvider {
switch (name) {
case "github":
return github;
case "google":
return google;
}
}
// ============================================================================
// Helpers
// ============================================================================
/**
* Generate a random state string for OAuth CSRF protection
*/
function generateState(): string {
const bytes = new Uint8Array(32);
crypto.getRandomValues(bytes);
return encodeBase64urlNoPadding(bytes);
}
function generateCodeVerifier(): string {
const bytes = new Uint8Array(32);
crypto.getRandomValues(bytes);
return encodeBase64urlNoPadding(bytes);
}
async function generateCodeChallenge(verifier: string): Promise<string> {
const bytes = new TextEncoder().encode(verifier);
const hash = sha256(bytes);
return encodeBase64urlNoPadding(hash);
}
// ============================================================================
// State storage interface
// ============================================================================
export interface StateStore {
set(state: string, data: OAuthState): Promise<void>;
get(state: string): Promise<OAuthState | null>;
delete(state: string): Promise<void>;
}
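For single-process deployments and tests, the `StateStore` contract can be satisfied by a map with per-entry expiry. A minimal in-memory sketch (illustrative only; a deployment behind multiple instances needs shared storage, since the authorize and callback requests may hit different servers):

```typescript
// Structural stand-in for OAuthState from oauth/types.js.
type StoredState = { provider: string; redirectUri: string; codeVerifier?: string };

// Structurally satisfies the StateStore interface above.
class InMemoryStateStore {
  private entries = new Map<string, { data: StoredState; expiresAt: number }>();

  constructor(private ttlMs: number = 10 * 60 * 1000) {}

  async set(state: string, data: StoredState): Promise<void> {
    this.entries.set(state, { data, expiresAt: Date.now() + this.ttlMs });
  }

  async get(state: string): Promise<StoredState | null> {
    const entry = this.entries.get(state);
    if (!entry) return null;
    if (entry.expiresAt < Date.now()) {
      this.entries.delete(state); // lazily evict expired state
      return null;
    }
    return entry.data;
  }

  async delete(state: string): Promise<void> {
    this.entries.delete(state);
  }
}
```

`handleOAuthCallback` deletes the state on first use, so a replayed callback fails with `invalid_state` regardless of the store's TTL.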
// ============================================================================
// Errors
// ============================================================================
export class OAuthError extends Error {
constructor(
public code:
| "invalid_state"
| "token_exchange_failed"
| "profile_fetch_failed"
| "user_not_found"
| "signup_not_allowed",
message: string,
) {
super(message);
this.name = "OAuthError";
}
}

import { afterEach, beforeEach, describe, expect, it, vi } from "vitest";
import { fetchGitHubEmail } from "./github.js";
describe("fetchGitHubEmail", () => {
beforeEach(() => {
vi.stubGlobal("fetch", vi.fn());
});
afterEach(() => {
vi.unstubAllGlobals();
});
it("sends User-Agent header required by GitHub API", async () => {
const mockFetch = vi.mocked(fetch);
mockFetch.mockResolvedValue(
new Response(JSON.stringify([{ email: "user@example.com", primary: true, verified: true }]), {
status: 200,
}),
);
await fetchGitHubEmail("test-token");
const [, init] = mockFetch.mock.calls[0] ?? [];
const headers = init?.headers as Record<string, string> | undefined;
expect(headers?.["User-Agent"]).toBe("emdash-cms");
});
it("returns the primary verified email", async () => {
vi.mocked(fetch).mockResolvedValue(
new Response(
JSON.stringify([
{ email: "other@example.com", primary: false, verified: true },
{ email: "primary@example.com", primary: true, verified: true },
]),
{ status: 200 },
),
);
const email = await fetchGitHubEmail("test-token");
expect(email).toBe("primary@example.com");
});
it("throws when GitHub API returns 403 (e.g. missing User-Agent)", async () => {
vi.mocked(fetch).mockResolvedValue(new Response("Forbidden", { status: 403 }));
await expect(fetchGitHubEmail("test-token")).rejects.toThrow(
"Failed to fetch GitHub emails: 403",
);
});
it("throws when no verified primary email exists", async () => {
vi.mocked(fetch).mockResolvedValue(
new Response(
JSON.stringify([{ email: "unverified@example.com", primary: true, verified: false }]),
{ status: 200 },
),
);
await expect(fetchGitHubEmail("test-token")).rejects.toThrow(
"No verified primary email found on GitHub account",
);
});
});

/**
* GitHub OAuth provider
*/
import { z } from "zod";
import type { OAuthProvider, OAuthProfile } from "../types.js";
const gitHubUserSchema = z.object({
id: z.number(),
login: z.string(),
name: z.string().nullable(),
email: z.string().nullable(),
avatar_url: z.string(),
});
const gitHubEmailSchema = z.object({
email: z.string(),
primary: z.boolean(),
verified: z.boolean(),
});
export const github: OAuthProvider = {
name: "github",
authorizeUrl: "https://github.com/login/oauth/authorize",
tokenUrl: "https://github.com/login/oauth/access_token",
userInfoUrl: "https://api.github.com/user",
scopes: ["read:user", "user:email"],
parseProfile(data: unknown): OAuthProfile {
const user = gitHubUserSchema.parse(data);
return {
id: String(user.id),
email: user.email || "", // Will be fetched separately if needed
name: user.name,
avatarUrl: user.avatar_url,
emailVerified: true, // GitHub verifies emails
};
},
};
/**
* Fetch the user's primary email from GitHub
* (needed because email may not be returned in the basic user endpoint)
*/
export async function fetchGitHubEmail(accessToken: string): Promise<string> {
const response = await fetch("https://api.github.com/user/emails", {
headers: {
Authorization: `Bearer ${accessToken}`,
Accept: "application/vnd.github+json",
"X-GitHub-Api-Version": "2022-11-28",
"User-Agent": "emdash-cms",
},
});
if (!response.ok) {
throw new Error(`Failed to fetch GitHub emails: ${response.status}`);
}
const json: unknown = await response.json();
const emails = z.array(gitHubEmailSchema).parse(json);
const primary = emails.find((e) => e.primary && e.verified);
if (!primary) {
throw new Error("No verified primary email found on GitHub account");
}
return primary.email;
}

/**
* Google OAuth provider (using OIDC)
*/
import { z } from "zod";
import type { OAuthProvider, OAuthProfile } from "../types.js";
const googleUserSchema = z.object({
sub: z.string(),
email: z.string(),
email_verified: z.boolean(),
name: z.string(),
picture: z.string(),
});
export const google: OAuthProvider = {
name: "google",
authorizeUrl: "https://accounts.google.com/o/oauth2/v2/auth",
tokenUrl: "https://oauth2.googleapis.com/token",
userInfoUrl: "https://openidconnect.googleapis.com/v1/userinfo",
scopes: ["openid", "email", "profile"],
parseProfile(data: unknown): OAuthProfile {
const user = googleUserSchema.parse(data);
return {
id: user.sub,
email: user.email,
name: user.name,
avatarUrl: user.picture,
emailVerified: user.email_verified,
};
},
};

/**
* OAuth types
*/
export interface OAuthProfile {
id: string;
email: string;
name: string | null;
avatarUrl: string | null;
emailVerified: boolean;
}
export interface OAuthProvider {
name: string;
authorizeUrl: string;
tokenUrl: string;
userInfoUrl?: string;
scopes: string[];
/**
* Parse the user profile from the provider's response
*/
parseProfile(data: unknown): OAuthProfile;
}
export interface OAuthConfig {
clientId: string;
clientSecret: string;
}
export interface OAuthState {
provider: string;
redirectUri: string;
codeVerifier?: string; // For PKCE
nonce?: string;
}

import { createHash, generateKeyPairSync, sign } from "node:crypto";
import { createAssertionSignatureMessage } from "@oslojs/webauthn";
import { describe, it, expect, vi } from "vitest";
import type { AuthAdapter, Credential } from "../types.js";
import { authenticateWithPasskey, PasskeyAuthenticationError } from "./authenticate.js";
import type { ChallengeStore } from "./types.js";
const credential: Credential = {
id: "registered-credential",
userId: "user_1",
publicKey: new Uint8Array(),
counter: 0,
deviceType: "singleDevice",
backedUp: false,
transports: [],
name: null,
createdAt: new Date(),
lastUsedAt: new Date(),
};
const config = {
rpName: "Test Site",
rpId: "localhost",
origins: ["http://localhost:4321"],
};
function createAdapter(): AuthAdapter {
return {
getCredentialById: vi.fn(async () => credential),
updateCredentialCounter: vi.fn(async () => undefined),
getUserById: vi.fn(async () => null),
} as unknown as AuthAdapter;
}
function createChallengeStore(): ChallengeStore {
return {
set: vi.fn(async () => undefined),
get: vi.fn(async () => null),
delete: vi.fn(async () => undefined),
};
}
function base64url(bytes: Uint8Array): string {
return Buffer.from(bytes).toString("base64url");
}
function createValidAssertion(opts: { rpId?: string; origin?: string } = {}) {
const rpId = opts.rpId ?? config.rpId;
const origin = opts.origin ?? config.origins[0];
if (!origin) throw new Error("origin must be defined for createValidAssertion");
const { privateKey, publicKey } = generateKeyPairSync("ec", { namedCurve: "P-256" });
const jwk = publicKey.export({ format: "jwk" });
if (typeof jwk.x !== "string" || typeof jwk.y !== "string") {
throw new Error("Failed to export test public key");
}
const publicKeyBytes = Buffer.concat([
Buffer.from([0x04]),
Buffer.from(jwk.x, "base64url"),
Buffer.from(jwk.y, "base64url"),
]);
const challenge = base64url(Buffer.from("test-challenge"));
const clientDataJSON = Buffer.from(
JSON.stringify({
type: "webauthn.get",
challenge,
origin,
}),
);
const rpIdHash = createHash("sha256").update(rpId).digest();
const signatureCounter = Buffer.alloc(4);
signatureCounter.writeUInt32BE(1);
const authenticatorData = Buffer.concat([rpIdHash, Buffer.from([0x01]), signatureCounter]);
const signatureMessage = createAssertionSignatureMessage(authenticatorData, clientDataJSON);
const signatureBytes = sign("sha256", signatureMessage, privateKey);
return {
credential: {
...credential,
publicKey: new Uint8Array(publicKeyBytes),
},
response: {
id: credential.id,
rawId: credential.id,
type: "public-key" as const,
response: {
clientDataJSON: base64url(clientDataJSON),
authenticatorData: base64url(authenticatorData),
signature: base64url(signatureBytes),
},
},
challengeStore: {
set: vi.fn(async () => undefined),
get: vi.fn(async () => ({ type: "authentication" as const, expiresAt: Date.now() + 60_000 })),
delete: vi.fn(async () => undefined),
} satisfies ChallengeStore,
};
}
describe("authenticateWithPasskey", () => {
it("throws a typed passkey auth error for malformed assertion payloads", async () => {
try {
await authenticateWithPasskey(
config,
createAdapter(),
{
id: "registered-credential",
rawId: "registered-credential",
type: "public-key",
response: {
clientDataJSON: "AA",
authenticatorData: "AA",
signature: "AA",
},
},
createChallengeStore(),
);
expect.fail("Expected passkey authentication to fail");
} catch (error) {
expect(error).toBeInstanceOf(PasskeyAuthenticationError);
expect(error).toMatchObject({ code: "invalid_response" });
}
});
it("throws a typed passkey auth error when a credential has no user", async () => {
const { credential: validCredential, response, challengeStore } = createValidAssertion();
const adapter = {
getCredentialById: vi.fn(async () => validCredential),
updateCredentialCounter: vi.fn(async () => undefined),
getUserById: vi.fn(async () => null),
} as unknown as AuthAdapter;
try {
await authenticateWithPasskey(config, adapter, response, challengeStore);
expect.fail("Expected passkey authentication to fail");
} catch (error) {
expect(error).toBeInstanceOf(PasskeyAuthenticationError);
expect(error).toMatchObject({ code: "user_not_found" });
}
});
it("rejects an origin that is not in the accepted list", async () => {
// Single-origin config; assertion arrives from a different subdomain.
const singleOriginConfig = {
rpName: "Test Site",
rpId: "example.com",
origins: ["https://example.com"],
};
const {
credential: validCredential,
response,
challengeStore,
} = createValidAssertion({
rpId: "example.com",
origin: "https://preview.example.com",
});
const adapter = {
getCredentialById: vi.fn(async () => validCredential),
updateCredentialCounter: vi.fn(async () => undefined),
getUserById: vi.fn(async () => ({ id: "user_1" })),
} as unknown as AuthAdapter;
try {
await authenticateWithPasskey(singleOriginConfig, adapter, response, challengeStore);
expect.fail("Expected origin rejection");
} catch (error) {
expect(error).toBeInstanceOf(PasskeyAuthenticationError);
expect(error).toMatchObject({ code: "invalid_origin" });
expect((error as PasskeyAuthenticationError).message).toContain(
"https://preview.example.com",
);
}
});
it("accepts an assertion from a subdomain when its origin is listed under a shared rpId", async () => {
// Reproduces emdash-cms/emdash#393 follow-up: apex + preview share rpId,
// passkey was bound to the apex but the user is hitting preview.
const multiOriginConfig = {
rpName: "Test Site",
rpId: "example.com",
origins: ["https://example.com", "https://preview.example.com"],
};
const {
credential: validCredential,
response,
challengeStore,
} = createValidAssertion({
rpId: "example.com",
origin: "https://preview.example.com",
});
const adapter = {
getCredentialById: vi.fn(async () => validCredential),
updateCredentialCounter: vi.fn(async () => undefined),
getUserById: vi.fn(async () => ({
id: "user_1",
email: "u@example.com",
name: null,
role: "admin",
})),
} as unknown as AuthAdapter;
// Should not throw — origin is in the accepted list.
const user = await authenticateWithPasskey(
multiOriginConfig,
adapter,
response,
challengeStore,
);
expect(user).toMatchObject({ id: "user_1" });
});
});

/**
* Passkey authentication (credential assertion)
*
* Based on oslo webauthn documentation:
* https://webauthn.oslojs.dev/examples/authentication
*/
import {
verifyECDSASignature,
p256,
decodeSEC1PublicKey,
decodePKIXECDSASignature,
} from "@oslojs/crypto/ecdsa";
import { sha256 } from "@oslojs/crypto/sha2";
import { encodeBase64urlNoPadding, decodeBase64urlIgnorePadding } from "@oslojs/encoding";
import {
parseAuthenticatorData,
parseClientDataJSON,
ClientDataType,
createAssertionSignatureMessage,
} from "@oslojs/webauthn";
import { generateToken } from "../tokens.js";
import type { Credential, AuthAdapter, User } from "../types.js";
import type {
AuthenticationOptions,
AuthenticationResponse,
VerifiedAuthentication,
ChallengeStore,
PasskeyConfig,
} from "./types.js";
const CHALLENGE_TTL = 5 * 60 * 1000; // 5 minutes
export type PasskeyAuthenticationErrorCode =
| "credential_not_found"
| "invalid_response"
| "challenge_not_found"
| "invalid_challenge_type"
| "challenge_expired"
| "invalid_client_data_type"
| "invalid_origin"
| "invalid_rp_id_hash"
| "user_presence_not_verified"
| "invalid_signature_counter"
| "invalid_signature"
| "user_not_found";
export class PasskeyAuthenticationError extends Error {
constructor(
public code: PasskeyAuthenticationErrorCode,
message: string,
) {
super(message);
this.name = "PasskeyAuthenticationError";
}
}
function invalidPasskeyResponseError(): PasskeyAuthenticationError {
return new PasskeyAuthenticationError("invalid_response", "Invalid passkey response");
}
function decodeAuthenticationResponse(response: AuthenticationResponse) {
try {
const clientDataJSON = decodeBase64urlIgnorePadding(response.response.clientDataJSON);
const authenticatorData = decodeBase64urlIgnorePadding(response.response.authenticatorData);
const signature = decodeBase64urlIgnorePadding(response.response.signature);
const clientData = parseClientDataJSON(clientDataJSON);
return { clientDataJSON, authenticatorData, signature, clientData };
} catch {
throw invalidPasskeyResponseError();
}
}
function parseAuthenticationData(authenticatorData: Uint8Array) {
try {
return parseAuthenticatorData(authenticatorData);
} catch {
throw invalidPasskeyResponseError();
}
}
function decodeAssertionSignature(signature: Uint8Array) {
try {
return decodePKIXECDSASignature(signature);
} catch {
throw invalidPasskeyResponseError();
}
}
/**
* Generate authentication options for signing in with a passkey
*/
export async function generateAuthenticationOptions(
config: PasskeyConfig,
credentials: Credential[],
challengeStore: ChallengeStore,
): Promise<AuthenticationOptions> {
const challenge = generateToken();
// Store challenge for verification
await challengeStore.set(challenge, {
type: "authentication",
expiresAt: Date.now() + CHALLENGE_TTL,
});
return {
challenge,
rpId: config.rpId,
timeout: 60000,
userVerification: "preferred",
allowCredentials:
credentials.length > 0
? credentials.map((cred) => ({
type: "public-key" as const,
id: cred.id,
transports: cred.transports,
}))
: undefined, // Empty = allow any discoverable credential
};
}
/**
* Verify an authentication response
*/
export async function verifyAuthenticationResponse(
config: PasskeyConfig,
response: AuthenticationResponse,
credential: Credential,
challengeStore: ChallengeStore,
): Promise<VerifiedAuthentication> {
const { clientDataJSON, authenticatorData, signature, clientData } =
decodeAuthenticationResponse(response);
// Verify client data type
if (clientData.type !== ClientDataType.Get) {
throw new PasskeyAuthenticationError("invalid_client_data_type", "Invalid client data type");
}
// Verify challenge - convert Uint8Array back to base64url string (no padding, matching stored format)
const challengeString = encodeBase64urlNoPadding(clientData.challenge);
const challengeData = await challengeStore.get(challengeString);
if (!challengeData) {
throw new PasskeyAuthenticationError("challenge_not_found", "Challenge not found or expired");
}
if (challengeData.type !== "authentication") {
throw new PasskeyAuthenticationError("invalid_challenge_type", "Invalid challenge type");
}
if (challengeData.expiresAt < Date.now()) {
await challengeStore.delete(challengeString);
throw new PasskeyAuthenticationError("challenge_expired", "Challenge expired");
}
// Delete challenge (single-use)
await challengeStore.delete(challengeString);
// Verify origin against the accepted list
if (!config.origins.includes(clientData.origin)) {
throw new PasskeyAuthenticationError(
"invalid_origin",
`Invalid origin: ${clientData.origin} not in [${config.origins.join(", ")}]`,
);
}
// Parse authenticator data
const authData = parseAuthenticationData(authenticatorData);
// Verify RP ID hash
if (!authData.verifyRelyingPartyIdHash(config.rpId)) {
throw new PasskeyAuthenticationError("invalid_rp_id_hash", "Invalid RP ID hash");
}
// Verify flags
if (!authData.userPresent) {
throw new PasskeyAuthenticationError(
"user_presence_not_verified",
"User presence not verified",
);
}
// Verify counter (prevent replay attacks)
if (authData.signatureCounter !== 0 && authData.signatureCounter <= credential.counter) {
throw new PasskeyAuthenticationError(
"invalid_signature_counter",
"Invalid signature counter - possible cloned authenticator",
);
}
// Create the message that was signed
const signatureMessage = createAssertionSignatureMessage(authenticatorData, clientDataJSON);
// Ensure public key is a Uint8Array (may come as Buffer from some DB drivers)
const publicKeyBytes =
credential.publicKey instanceof Uint8Array
? credential.publicKey
: new Uint8Array(credential.publicKey);
// Decode the stored SEC1-encoded public key and verify signature
// The signature from WebAuthn is DER-encoded (PKIX format)
const ecdsaPublicKey = decodeSEC1PublicKey(p256, publicKeyBytes);
const ecdsaSignature = decodeAssertionSignature(signature);
const hash = sha256(signatureMessage);
const signatureValid = verifyECDSASignature(ecdsaPublicKey, hash, ecdsaSignature);
if (!signatureValid) {
throw new PasskeyAuthenticationError("invalid_signature", "Invalid signature");
}
return {
credentialId: response.id,
newCounter: authData.signatureCounter,
};
}
/**
* Authenticate a user with a passkey
*/
export async function authenticateWithPasskey(
config: PasskeyConfig,
adapter: AuthAdapter,
response: AuthenticationResponse,
challengeStore: ChallengeStore,
): Promise<User> {
// Find the credential
const credential = await adapter.getCredentialById(response.id);
if (!credential) {
throw new PasskeyAuthenticationError("credential_not_found", "Credential not found");
}
// Verify the response
const verified = await verifyAuthenticationResponse(config, response, credential, challengeStore);
// Update counter
await adapter.updateCredentialCounter(verified.credentialId, verified.newCounter);
// Get the user
const user = await adapter.getUserById(credential.userId);
if (!user) {
throw new PasskeyAuthenticationError("user_not_found", "User not found");
}
return user;
}
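
The `ChallengeStore` consumed above is adapter-provided; a minimal in-memory sketch (a hypothetical dev-server implementation, not part of this package and unsuitable for multi-process production deployments) that satisfies the single-use, TTL-bound flow could look like:

```typescript
// Hypothetical in-memory ChallengeStore. The ChallengeData shape is restated
// locally so the snippet is self-contained; the real type lives in ./types.js.
type ChallengeData = {
  type: "registration" | "authentication";
  userId?: string;
  expiresAt: number;
};

class MemoryChallengeStore {
  private challenges = new Map<string, ChallengeData>();

  async set(challenge: string, data: ChallengeData): Promise<void> {
    this.challenges.set(challenge, data);
  }

  async get(challenge: string): Promise<ChallengeData | null> {
    const data = this.challenges.get(challenge);
    if (!data) return null;
    if (data.expiresAt < Date.now()) {
      // Lazy expiry: drop stale entries on read.
      this.challenges.delete(challenge);
      return null;
    }
    return data;
  }

  async delete(challenge: string): Promise<void> {
    this.challenges.delete(challenge);
  }
}
```

Note the verifiers above also check `expiresAt` themselves, so the lazy expiry here is belt-and-braces rather than the security boundary.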


@@ -0,0 +1,29 @@
/**
* Passkey authentication module
*/
export type {
RegistrationOptions,
RegistrationResponse,
VerifiedRegistration,
AuthenticationOptions,
AuthenticationResponse,
VerifiedAuthentication,
ChallengeStore,
ChallengeData,
PasskeyConfig,
} from "./types.js";
export {
generateRegistrationOptions,
verifyRegistrationResponse,
registerPasskey,
} from "./register.js";
export type { PasskeyAuthenticationErrorCode } from "./authenticate.js";
export {
PasskeyAuthenticationError,
generateAuthenticationOptions,
verifyAuthenticationResponse,
authenticateWithPasskey,
} from "./authenticate.js";


@@ -0,0 +1,64 @@
import { encodeBase64urlNoPadding } from "@oslojs/encoding";
import { describe, expect, it, vi } from "vitest";
import { verifyRegistrationResponse } from "./register.js";
import type { ChallengeStore, PasskeyConfig } from "./types.js";
/**
* Locks in origin-check parity with `authenticate.ts`. The two functions
* share the same 3-line block; without this test, a divergence would slip
* through. The challenge mock satisfies the prior steps so origin verification
* is the next gate the function reaches — `attestationObject` is junk, which
* never gets parsed because the origin check fires first.
*/
const config: PasskeyConfig = {
rpName: "Test Site",
rpId: "example.com",
origins: ["https://example.com"],
};
function base64url(bytes: Uint8Array): string {
return Buffer.from(bytes).toString("base64url");
}
function makeChallengeStore(): ChallengeStore {
return {
set: vi.fn(async () => undefined),
get: vi.fn(async () => ({
type: "registration" as const,
userId: "user_1",
expiresAt: Date.now() + 60_000,
})),
delete: vi.fn(async () => undefined),
};
}
describe("verifyRegistrationResponse", () => {
it("rejects an origin not in the accepted list", async () => {
const challenge = encodeBase64urlNoPadding(new TextEncoder().encode("test-challenge"));
const clientDataJSON = Buffer.from(
JSON.stringify({
type: "webauthn.create",
challenge,
origin: "https://attacker.com",
}),
);
await expect(
verifyRegistrationResponse(
config,
{
id: "test-credential",
rawId: "test-credential",
type: "public-key",
response: {
clientDataJSON: base64url(clientDataJSON),
attestationObject: "AA",
},
},
makeChallengeStore(),
),
).rejects.toThrow(/Invalid origin: https:\/\/attacker\.com not in/);
});
});


@@ -0,0 +1,232 @@
/**
* Passkey registration (credential creation)
*
* Based on oslo webauthn documentation:
* https://webauthn.oslojs.dev/examples/registration
*/
import { ECDSAPublicKey, p256 } from "@oslojs/crypto/ecdsa";
import { encodeBase64urlNoPadding, decodeBase64urlIgnorePadding } from "@oslojs/encoding";
import {
parseAttestationObject,
parseClientDataJSON,
coseAlgorithmES256,
coseAlgorithmRS256,
coseEllipticCurveP256,
ClientDataType,
AttestationStatementFormat,
COSEKeyType,
} from "@oslojs/webauthn";
import { generateToken } from "../tokens.js";
import type { Credential, NewCredential, AuthAdapter, User, DeviceType } from "../types.js";
import type {
RegistrationOptions,
RegistrationResponse,
VerifiedRegistration,
ChallengeStore,
PasskeyConfig,
} from "./types.js";
const CHALLENGE_TTL = 5 * 60 * 1000; // 5 minutes
export type { PasskeyConfig };
/**
* Generate registration options for creating a new passkey
*/
export async function generateRegistrationOptions(
config: PasskeyConfig,
user: Pick<User, "id" | "email" | "name">,
existingCredentials: Credential[],
challengeStore: ChallengeStore,
): Promise<RegistrationOptions> {
const challenge = generateToken();
// Store challenge for verification
await challengeStore.set(challenge, {
type: "registration",
userId: user.id,
expiresAt: Date.now() + CHALLENGE_TTL,
});
// Encode user ID as base64url
const userIdBytes = new TextEncoder().encode(user.id);
const userIdEncoded = encodeBase64urlNoPadding(userIdBytes);
return {
challenge,
rp: {
name: config.rpName,
id: config.rpId,
},
user: {
id: userIdEncoded,
name: user.email,
displayName: user.name || user.email,
},
pubKeyCredParams: [
{ type: "public-key", alg: coseAlgorithmES256 }, // ES256 (-7)
{ type: "public-key", alg: coseAlgorithmRS256 }, // RS256 (-257)
],
timeout: 60000,
attestation: "none", // We don't need attestation for our use case
authenticatorSelection: {
residentKey: "preferred", // Allow discoverable credentials
userVerification: "preferred",
},
excludeCredentials: existingCredentials.map((cred) => ({
type: "public-key" as const,
id: cred.id,
transports: cred.transports,
})),
};
}
/**
* Verify a registration response and extract credential data
*/
export async function verifyRegistrationResponse(
config: PasskeyConfig,
response: RegistrationResponse,
challengeStore: ChallengeStore,
): Promise<VerifiedRegistration> {
// Decode the response
const clientDataJSON = decodeBase64urlIgnorePadding(response.response.clientDataJSON);
const attestationObject = decodeBase64urlIgnorePadding(response.response.attestationObject);
// Parse client data
const clientData = parseClientDataJSON(clientDataJSON);
// Verify client data
if (clientData.type !== ClientDataType.Create) {
throw new Error("Invalid client data type");
}
// Verify challenge - convert Uint8Array back to base64url string (no padding, matching stored format)
const challengeString = encodeBase64urlNoPadding(clientData.challenge);
const challengeData = await challengeStore.get(challengeString);
if (!challengeData) {
throw new Error("Challenge not found or expired");
}
if (challengeData.type !== "registration") {
throw new Error("Invalid challenge type");
}
if (challengeData.expiresAt < Date.now()) {
await challengeStore.delete(challengeString);
throw new Error("Challenge expired");
}
// Delete challenge (single-use)
await challengeStore.delete(challengeString);
// Verify origin against the accepted list
if (!config.origins.includes(clientData.origin)) {
throw new Error(`Invalid origin: ${clientData.origin} not in [${config.origins.join(", ")}]`);
}
// Parse attestation object
const attestation = parseAttestationObject(attestationObject);
  // We request attestation "none" (see generateRegistrationOptions). If an
  // authenticator returns another format anyway, we skip verifying the
  // attestation statement and trust the credential as-is.
const { authenticatorData } = attestation;
// Verify RP ID hash
if (!authenticatorData.verifyRelyingPartyIdHash(config.rpId)) {
throw new Error("Invalid RP ID hash");
}
// Verify flags
if (!authenticatorData.userPresent) {
throw new Error("User presence not verified");
}
// Extract credential data
if (!authenticatorData.credential) {
throw new Error("No credential data in attestation");
}
const { credential } = authenticatorData;
// Verify algorithm is supported and encode public key
// Currently only supporting ES256 (ECDSA with P-256)
const algorithm = credential.publicKey.algorithm();
let encodedPublicKey: Uint8Array;
if (algorithm === coseAlgorithmES256) {
// Verify it's EC2 key type
if (credential.publicKey.type() !== COSEKeyType.EC2) {
throw new Error("Expected EC2 key type for ES256");
}
const cosePublicKey = credential.publicKey.ec2();
if (cosePublicKey.curve !== coseEllipticCurveP256) {
throw new Error("Expected P-256 curve for ES256");
}
// Encode as SEC1 uncompressed format for storage
encodedPublicKey = new ECDSAPublicKey(
p256,
cosePublicKey.x,
cosePublicKey.y,
).encodeSEC1Uncompressed();
} else if (algorithm === coseAlgorithmRS256) {
// RSA is less common for passkeys, skip for now
throw new Error("RS256 not yet supported - please use ES256");
} else {
throw new Error(`Unsupported algorithm: ${algorithm}`);
}
// Determine device type and backup status
// Note: oslo webauthn doesn't expose backup flags, so we default to singleDevice
// In practice, most modern passkeys are multi-device (e.g., iCloud Keychain, Google Password Manager)
const deviceType: DeviceType = "singleDevice";
const backedUp = false;
return {
credentialId: response.id,
publicKey: encodedPublicKey,
counter: authenticatorData.signatureCounter,
deviceType,
backedUp,
transports: response.response.transports ?? [],
};
}
/**
* Register a new passkey for a user
*/
export async function registerPasskey(
adapter: AuthAdapter,
userId: string,
verified: VerifiedRegistration,
name?: string,
): Promise<Credential> {
// Check credential limit
const count = await adapter.countCredentialsByUserId(userId);
if (count >= 10) {
throw new Error("Maximum number of passkeys reached (10)");
}
// Check if credential already exists
const existing = await adapter.getCredentialById(verified.credentialId);
if (existing) {
throw new Error("Credential already registered");
}
const newCredential: NewCredential = {
id: verified.credentialId,
userId,
publicKey: verified.publicKey,
counter: verified.counter,
deviceType: verified.deviceType,
backedUp: verified.backedUp,
transports: verified.transports,
name,
};
return adapter.createCredential(newCredential);
}
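
The challenge lookup above depends on a round-trip invariant: the key stored by `generateRegistrationOptions` and the re-encoding of `clientData.challenge` must be byte-for-byte equal, which is why both sides use *unpadded* base64url. A self-contained sketch of that invariant (using Node's built-in `Buffer`, whose `"base64url"` encoding is already unpadded; the real code uses `@oslojs/encoding`):

```typescript
// Encode challenge bytes the way the store key is formed: base64url, no '='
// padding. A padded encoding here would never match the stored key.
function encodeChallenge(bytes: Uint8Array): string {
  return Buffer.from(bytes).toString("base64url");
}

// Decode back to bytes, accepting the same unpadded form.
function decodeChallenge(s: string): Uint8Array {
  return new Uint8Array(Buffer.from(s, "base64url"));
}
```

If one side used standard base64 or padded base64url, `challengeStore.get` would miss and every verification would fail with `challenge_not_found`.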


@@ -0,0 +1,126 @@
/**
* WebAuthn types for passkey authentication
*/
import type { AuthenticatorTransport, DeviceType } from "../types.js";
// ============================================================================
// Registration (Creating a new passkey)
// ============================================================================
export interface RegistrationOptions {
challenge: string; // Base64url encoded
rp: {
name: string;
id: string;
};
user: {
id: string; // Base64url encoded user ID
name: string;
displayName: string;
};
pubKeyCredParams: Array<{
type: "public-key";
alg: number; // COSE algorithm identifier
}>;
timeout?: number;
attestation?: "none" | "indirect" | "direct";
authenticatorSelection?: {
authenticatorAttachment?: "platform" | "cross-platform";
residentKey?: "discouraged" | "preferred" | "required";
requireResidentKey?: boolean;
userVerification?: "discouraged" | "preferred" | "required";
};
excludeCredentials?: Array<{
type: "public-key";
id: string; // Base64url encoded credential ID
transports?: AuthenticatorTransport[];
}>;
}
export interface RegistrationResponse {
id: string; // Base64url credential ID
rawId: string; // Base64url
type: "public-key";
response: {
clientDataJSON: string; // Base64url
attestationObject: string; // Base64url
transports?: AuthenticatorTransport[];
};
authenticatorAttachment?: "platform" | "cross-platform";
}
export interface VerifiedRegistration {
credentialId: string;
publicKey: Uint8Array;
counter: number;
deviceType: DeviceType;
backedUp: boolean;
transports: AuthenticatorTransport[];
}
// ============================================================================
// Authentication (Using an existing passkey)
// ============================================================================
export interface AuthenticationOptions {
challenge: string; // Base64url encoded
rpId: string;
timeout?: number;
userVerification?: "discouraged" | "preferred" | "required";
allowCredentials?: Array<{
type: "public-key";
id: string; // Base64url encoded credential ID
transports?: AuthenticatorTransport[];
}>;
}
export interface AuthenticationResponse {
id: string; // Base64url credential ID
rawId: string; // Base64url
type: "public-key";
response: {
clientDataJSON: string; // Base64url
authenticatorData: string; // Base64url
signature: string; // Base64url
userHandle?: string; // Base64url (user ID)
};
authenticatorAttachment?: "platform" | "cross-platform";
}
export interface VerifiedAuthentication {
credentialId: string;
newCounter: number;
}
// ============================================================================
// Challenge storage
// ============================================================================
export interface ChallengeStore {
set(challenge: string, data: ChallengeData): Promise<void>;
get(challenge: string): Promise<ChallengeData | null>;
delete(challenge: string): Promise<void>;
}
export interface ChallengeData {
type: "registration" | "authentication";
userId?: string; // For registration, the user being registered
expiresAt: number;
}
// ============================================================================
// Passkey Configuration
// ============================================================================
export interface PasskeyConfig {
rpName: string;
rpId: string;
/**
* Accepted client-data origins. The first entry is the canonical/preferred
* origin; verification accepts any entry. Multiple entries support
* deployments where the same RP is reachable under several hostnames
* sharing `rpId` (e.g. apex + preview subdomain).
*/
origins: string[];
}
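
A multi-origin config for the apex-plus-preview deployment this field exists for might look like the following (hostnames are illustrative, not from the codebase); verification is a plain membership test against `origins`, mirroring the origin gate in `verifyAuthenticationResponse` and `verifyRegistrationResponse`:

```typescript
// PasskeyConfig restated locally so the snippet is self-contained.
type PasskeyConfigSketch = { rpName: string; rpId: string; origins: string[] };

// Hypothetical deployment: both hostnames share the registrable rpId
// "example.com"; the first entry is the canonical origin.
const config: PasskeyConfigSketch = {
  rpName: "My Site",
  rpId: "example.com",
  origins: ["https://example.com", "https://preview.example.com"],
};

// Exact-match check: scheme, host, and port must all match an entry.
function isAllowedOrigin(cfg: PasskeyConfigSketch, origin: string): boolean {
  return cfg.origins.includes(origin);
}
```

Because the match is exact-string, a different port (or scheme) is a different origin and must be listed explicitly.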


@@ -0,0 +1,164 @@
import { describe, it, expect } from "vitest";
import {
hasPermission,
requirePermission,
canActOnOwn,
requirePermissionOnResource,
PermissionError,
} from "./rbac.js";
import { Role } from "./types.js";
describe("rbac", () => {
describe("hasPermission", () => {
it("returns false for null user", () => {
expect(hasPermission(null, "content:read")).toBe(false);
});
it("returns false for undefined user", () => {
expect(hasPermission(undefined, "content:read")).toBe(false);
});
it("allows subscriber to read content", () => {
expect(hasPermission({ role: Role.SUBSCRIBER }, "content:read")).toBe(true);
});
it("denies subscriber from creating content", () => {
expect(hasPermission({ role: Role.SUBSCRIBER }, "content:create")).toBe(false);
});
it("allows contributor to create content", () => {
expect(hasPermission({ role: Role.CONTRIBUTOR }, "content:create")).toBe(true);
});
it("allows admin to do anything", () => {
const admin = { role: Role.ADMIN };
expect(hasPermission(admin, "content:read")).toBe(true);
expect(hasPermission(admin, "content:create")).toBe(true);
expect(hasPermission(admin, "users:manage")).toBe(true);
expect(hasPermission(admin, "schema:manage")).toBe(true);
});
it("denies editor from managing users", () => {
expect(hasPermission({ role: Role.EDITOR }, "users:manage")).toBe(false);
});
it("allows author to edit own media", () => {
expect(hasPermission({ role: Role.AUTHOR }, "media:edit_own")).toBe(true);
});
it("denies contributor from editing media", () => {
expect(hasPermission({ role: Role.CONTRIBUTOR }, "media:edit_own")).toBe(false);
});
it("allows editor to edit any media", () => {
expect(hasPermission({ role: Role.EDITOR }, "media:edit_any")).toBe(true);
});
it("denies author from editing any media", () => {
expect(hasPermission({ role: Role.AUTHOR }, "media:edit_any")).toBe(false);
});
// content:read_drafts gates non-published content reads and editor-only
// views (revisions, compare, trash, preview-url).
it("denies subscriber from reading drafts", () => {
expect(hasPermission({ role: Role.SUBSCRIBER }, "content:read_drafts")).toBe(false);
});
it("allows contributor to read drafts", () => {
expect(hasPermission({ role: Role.CONTRIBUTOR }, "content:read_drafts")).toBe(true);
});
it("allows editor to read drafts", () => {
expect(hasPermission({ role: Role.EDITOR }, "content:read_drafts")).toBe(true);
});
});
describe("requirePermission", () => {
it("throws for null user", () => {
expect(() => requirePermission(null, "content:read")).toThrow(PermissionError);
});
    it("throws unauthorized for missing user", () => {
      expect.assertions(2);
      try {
        requirePermission(null, "content:read");
      } catch (e) {
        expect(e).toBeInstanceOf(PermissionError);
        expect((e as PermissionError).code).toBe("unauthorized");
      }
    });
    it("throws forbidden for insufficient permissions", () => {
      expect.assertions(2);
      try {
        requirePermission({ role: Role.SUBSCRIBER }, "content:create");
      } catch (e) {
        expect(e).toBeInstanceOf(PermissionError);
        expect((e as PermissionError).code).toBe("forbidden");
      }
    });
it("does not throw for sufficient permissions", () => {
expect(() => requirePermission({ role: Role.ADMIN }, "content:create")).not.toThrow();
});
});
describe("canActOnOwn", () => {
const user = { role: Role.AUTHOR, id: "user-1" };
it("allows action on own resource with own permission", () => {
expect(canActOnOwn(user, "user-1", "content:edit_own", "content:edit_any")).toBe(true);
});
    it("denies action on another user's resource without the 'any' permission", () => {
expect(canActOnOwn(user, "user-2", "content:edit_own", "content:edit_any")).toBe(false);
});
it("allows editor to edit any resource", () => {
const editor = { role: Role.EDITOR, id: "editor-1" };
expect(canActOnOwn(editor, "user-2", "content:edit_own", "content:edit_any")).toBe(true);
});
it("allows author to edit own media", () => {
expect(canActOnOwn(user, "user-1", "media:edit_own", "media:edit_any")).toBe(true);
});
    it("denies author from editing others' media", () => {
expect(canActOnOwn(user, "user-2", "media:edit_own", "media:edit_any")).toBe(false);
});
it("denies contributor from editing any media (including own)", () => {
const contributor = { role: Role.CONTRIBUTOR, id: "contrib-1" };
expect(canActOnOwn(contributor, "contrib-1", "media:edit_own", "media:edit_any")).toBe(false);
});
it("allows editor to edit any media", () => {
const editor = { role: Role.EDITOR, id: "editor-1" };
expect(canActOnOwn(editor, "user-2", "media:edit_own", "media:edit_any")).toBe(true);
});
    it("F17: empty-string ownerId is not treated as owned by a user with id ''", () => {
// A user with id="" and *:edit_own (but NOT *:edit_any) must NOT
// be able to edit content with ownerId="" — that ownerId means
// "no recorded owner" (e.g. seed-imported content), and granting
// edit-own would be an accidental privilege escalation.
const orphanedUser = { role: Role.AUTHOR, id: "" };
expect(canActOnOwn(orphanedUser, "", "content:edit_own", "content:edit_any")).toBe(false);
});
});
describe("requirePermissionOnResource", () => {
it("allows author to edit own content", () => {
const user = { role: Role.AUTHOR, id: "user-1" };
expect(() =>
requirePermissionOnResource(user, "user-1", "content:edit_own", "content:edit_any"),
).not.toThrow();
});
    it("throws for author editing others' content", () => {
const user = { role: Role.AUTHOR, id: "user-1" };
expect(() =>
requirePermissionOnResource(user, "user-2", "content:edit_own", "content:edit_any"),
).toThrow(PermissionError);
});
});
});

219
packages/auth/src/rbac.ts Normal file

@@ -0,0 +1,219 @@
/**
* Role-Based Access Control
*/
import type { ApiTokenScope } from "./tokens.js";
import { Role, type RoleLevel } from "./types.js";
/**
* Permission definitions with minimum role required
*/
export const Permissions = {
// Content
"content:read": Role.SUBSCRIBER,
// content:read_drafts gates non-published content (drafts, scheduled, trash)
// and editor-only views (revisions, compare, preview-url). Subscribers may
// hold content:read for member-only published content but must not see
// drafts.
"content:read_drafts": Role.CONTRIBUTOR,
"content:create": Role.CONTRIBUTOR,
"content:edit_own": Role.AUTHOR,
"content:edit_any": Role.EDITOR,
"content:delete_own": Role.AUTHOR,
"content:delete_any": Role.EDITOR,
"content:publish_own": Role.AUTHOR,
"content:publish_any": Role.EDITOR,
// Media
"media:read": Role.SUBSCRIBER,
"media:upload": Role.CONTRIBUTOR,
"media:edit_own": Role.AUTHOR,
"media:edit_any": Role.EDITOR,
"media:delete_own": Role.AUTHOR,
"media:delete_any": Role.EDITOR,
// Taxonomies
"taxonomies:read": Role.SUBSCRIBER,
"taxonomies:manage": Role.EDITOR,
// Comments
"comments:read": Role.SUBSCRIBER,
"comments:moderate": Role.EDITOR,
"comments:delete": Role.ADMIN,
"comments:settings": Role.ADMIN,
// Menus
"menus:read": Role.SUBSCRIBER,
"menus:manage": Role.EDITOR,
// Widgets
"widgets:read": Role.SUBSCRIBER,
"widgets:manage": Role.EDITOR,
// Sections
"sections:read": Role.SUBSCRIBER,
"sections:manage": Role.EDITOR,
// Redirects
"redirects:read": Role.EDITOR,
"redirects:manage": Role.ADMIN,
// Users
"users:read": Role.ADMIN,
"users:invite": Role.ADMIN,
"users:manage": Role.ADMIN,
// Settings
"settings:read": Role.EDITOR,
"settings:manage": Role.ADMIN,
// Schema (content types)
"schema:read": Role.EDITOR,
"schema:manage": Role.ADMIN,
// Plugins
"plugins:read": Role.EDITOR,
"plugins:manage": Role.ADMIN,
// Import
"import:execute": Role.ADMIN,
// Search
"search:read": Role.SUBSCRIBER,
"search:manage": Role.ADMIN,
// Auth
"auth:manage_own_credentials": Role.SUBSCRIBER,
"auth:manage_connections": Role.ADMIN,
} as const;
export type Permission = keyof typeof Permissions;
/**
* Check if a user has a specific permission
*/
export function hasPermission(
user: { role: RoleLevel } | null | undefined,
permission: Permission,
): boolean {
if (!user) return false;
return user.role >= Permissions[permission];
}
/**
* Require a permission, throwing if not met
*/
export function requirePermission(
user: { role: RoleLevel } | null | undefined,
permission: Permission,
): asserts user is { role: RoleLevel } {
if (!user) {
throw new PermissionError("unauthorized", "Authentication required");
}
if (!hasPermission(user, permission)) {
throw new PermissionError("forbidden", `Missing permission: ${permission}`);
}
}
/**
* Check if user can perform action on a resource they own
*/
export function canActOnOwn(
user: { role: RoleLevel; id: string } | null | undefined,
ownerId: string,
ownPermission: Permission,
anyPermission: Permission,
): boolean {
if (!user) return false;
// Defense in depth: an empty-string ownerId means "no recorded owner"
// (e.g. seed-imported content with `authorId: null` extracted to ""),
// not "owned by an unauthenticated user". If both the user.id and the
// ownerId are "", treating them as a match would accidentally grant
// edit-own — fall through to the any-permission check instead.
if (ownerId !== "" && user.id === ownerId) {
return hasPermission(user, ownPermission);
}
return hasPermission(user, anyPermission);
}
/**
* Require permission on a resource, checking ownership
*/
export function requirePermissionOnResource(
user: { role: RoleLevel; id: string } | null | undefined,
ownerId: string,
ownPermission: Permission,
anyPermission: Permission,
): asserts user is { role: RoleLevel; id: string } {
if (!user) {
throw new PermissionError("unauthorized", "Authentication required");
}
if (!canActOnOwn(user, ownerId, ownPermission, anyPermission)) {
throw new PermissionError("forbidden", `Missing permission: ${anyPermission}`);
}
}
export class PermissionError extends Error {
constructor(
public code: "unauthorized" | "forbidden",
message: string,
) {
super(message);
this.name = "PermissionError";
}
}
// ---------------------------------------------------------------------------
// API Token Scope ↔ Role mapping
//
// Maps each API token scope to the minimum RBAC role required to hold it.
// Used at token issuance time to clamp granted scopes to the user's role.
// ---------------------------------------------------------------------------
/**
* Minimum role required for each API token scope.
*
* This is the authoritative mapping between the two authorization systems
* (RBAC roles and API token scopes). When issuing a token, the granted
* scopes must be intersected with the scopes allowed by the user's role.
*/
const SCOPE_MIN_ROLE: Record<ApiTokenScope, RoleLevel> = {
"content:read": Role.SUBSCRIBER,
"content:write": Role.CONTRIBUTOR,
"media:read": Role.SUBSCRIBER,
"media:write": Role.CONTRIBUTOR,
"schema:read": Role.EDITOR,
"schema:write": Role.ADMIN,
"taxonomies:manage": Role.EDITOR,
"menus:manage": Role.EDITOR,
"settings:read": Role.EDITOR,
"settings:manage": Role.ADMIN,
admin: Role.ADMIN,
};
/**
* Return the maximum set of API token scopes a given role level may hold.
*
* Used at token issuance time (device flow, authorization code exchange)
* to enforce: effective_scopes = requested_scopes ∩ scopesForRole(role).
*/
export function scopesForRole(role: RoleLevel): ApiTokenScope[] {
  // eslint-disable-next-line @typescript-eslint/no-unsafe-type-assertion -- Object.entries loses tuple types; SCOPE_MIN_ROLE keys are ApiTokenScope by construction
const entries = Object.entries(SCOPE_MIN_ROLE) as [ApiTokenScope, RoleLevel][];
return entries.reduce<ApiTokenScope[]>((acc, [scope, minRole]) => {
if (role >= minRole) acc.push(scope);
return acc;
}, []);
}
/**
* Clamp a set of requested scopes to those permitted by a user's role.
*
* Returns the intersection of `requested` and the scopes the role allows.
* This is the central policy enforcement point: effective permissions =
* role permissions ∩ token scopes.
*/
export function clampScopes(requested: string[], role: RoleLevel): string[] {
const allowed = new Set<string>(scopesForRole(role));
return requested.filter((s) => allowed.has(s));
}
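
The clamp policy can be exercised end-to-end with a restated mini model. The numeric role levels below are illustrative (the real values live in `types.ts`; only the SUBSCRIBER < CONTRIBUTOR < AUTHOR < EDITOR < ADMIN ordering matters), and the scope table is a subset of `SCOPE_MIN_ROLE` above:

```typescript
// Illustrative ordinal role levels; a scope is grantable when role >= min.
const Role = { SUBSCRIBER: 0, CONTRIBUTOR: 1, AUTHOR: 2, EDITOR: 3, ADMIN: 4 } as const;
type RoleLevel = (typeof Role)[keyof typeof Role];

// Subset of the authoritative SCOPE_MIN_ROLE table in rbac.ts.
const SCOPE_MIN_ROLE: Record<string, RoleLevel> = {
  "content:read": Role.SUBSCRIBER,
  "content:write": Role.CONTRIBUTOR,
  "settings:read": Role.EDITOR,
  "settings:manage": Role.ADMIN,
  admin: Role.ADMIN,
};

// All scopes a role may hold: every scope whose minimum the role meets.
function scopesForRole(role: RoleLevel): string[] {
  return Object.entries(SCOPE_MIN_ROLE)
    .filter(([, min]) => role >= min)
    .map(([scope]) => scope);
}

// effective_scopes = requested_scopes ∩ scopesForRole(role), preserving
// the request's order.
function clampScopes(requested: string[], role: RoleLevel): string[] {
  const allowed = new Set(scopesForRole(role));
  return requested.filter((s) => allowed.has(s));
}
```

An EDITOR requesting `settings:manage` or `admin` simply gets those scopes dropped from the issued token rather than an error, which is the device-flow behavior the comment above describes.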

210
packages/auth/src/signup.ts Normal file

@@ -0,0 +1,210 @@
/**
* Self-signup for allowed email domains
*/
import { escapeHtml } from "./invite.js";
import { generateTokenWithHash, hashToken } from "./tokens.js";
import type { AuthAdapter, RoleLevel, EmailMessage, User } from "./types.js";
const TOKEN_EXPIRY_MS = 15 * 60 * 1000; // 15 minutes
/** Function that sends an email (matches the EmailPipeline.send signature) */
export type EmailSendFn = (message: EmailMessage) => Promise<void>;
/**
* Add artificial delay with jitter to prevent timing attacks.
* Range approximates the time for token creation + email send.
*/
async function timingDelay(): Promise<void> {
const delay = 100 + Math.random() * 150; // 100-250ms
await new Promise((resolve) => setTimeout(resolve, delay));
}
export interface SignupConfig {
baseUrl: string;
siteName: string;
/** Optional email sender. When omitted, signup verification cannot be sent. */
email?: EmailSendFn;
}
/**
* Check if an email domain is allowed for self-signup
*/
export async function canSignup(
adapter: AuthAdapter,
email: string,
): Promise<{ allowed: boolean; role: RoleLevel } | null> {
const domain = email.split("@")[1]?.toLowerCase();
if (!domain) return null;
const allowedDomain = await adapter.getAllowedDomain(domain);
if (!allowedDomain || !allowedDomain.enabled) {
return null;
}
return {
allowed: true,
role: allowedDomain.defaultRole,
};
}
/**
* Request self-signup (sends verification email).
*
* Requires `config.email` to be set. Throws if no email sender is configured.
*/
export async function requestSignup(
config: SignupConfig,
adapter: AuthAdapter,
email: string,
): Promise<void> {
if (!config.email) {
throw new SignupError("email_not_configured", "Email is not configured");
}
// Check if user already exists
const existing = await adapter.getUserByEmail(email);
if (existing) {
// Don't reveal that user exists - add delay to match successful path timing
await timingDelay();
return;
}
// Check if domain is allowed
const signup = await canSignup(adapter, email);
if (!signup) {
// Don't reveal that domain is not allowed - add delay to match successful path timing
await timingDelay();
return;
}
// Generate token
const { token, hash } = generateTokenWithHash();
// Store token with role info
await adapter.createToken({
hash,
email,
type: "email_verify",
role: signup.role,
expiresAt: new Date(Date.now() + TOKEN_EXPIRY_MS),
});
// Build verification URL
const url = new URL("/_emdash/api/auth/signup/verify", config.baseUrl);
url.searchParams.set("token", token);
// Send email
const safeName = escapeHtml(config.siteName);
await config.email({
to: email,
subject: `Verify your email for ${config.siteName}`,
text: `Click this link to verify your email and create your account:\n\n${url.toString()}\n\nThis link expires in 15 minutes.\n\nIf you didn't request this, you can safely ignore this email.`,
html: `
<!DOCTYPE html>
<html>
<head>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
</head>
<body style="font-family: -apple-system, BlinkMacSystemFont, 'Segoe UI', Roboto, sans-serif; line-height: 1.5; color: #333; max-width: 600px; margin: 0 auto; padding: 20px;">
<h1 style="font-size: 24px; margin-bottom: 20px;">Verify your email</h1>
<p>Click the button below to verify your email and create your ${safeName} account:</p>
<p style="margin: 30px 0;">
<a href="${url.toString()}" style="background-color: #0066cc; color: white; padding: 12px 24px; text-decoration: none; border-radius: 6px; display: inline-block;">Verify Email</a>
</p>
<p style="color: #666; font-size: 14px;">This link expires in 15 minutes.</p>
<p style="color: #666; font-size: 14px;">If you didn't request this, you can safely ignore this email.</p>
</body>
</html>`,
});
}
/**
* Validate a signup verification token
*/
export async function validateSignupToken(
adapter: AuthAdapter,
token: string,
): Promise<{ email: string; role: RoleLevel }> {
const hash = hashToken(token);
const authToken = await adapter.getToken(hash, "email_verify");
if (!authToken) {
throw new SignupError("invalid_token", "Invalid or expired verification link");
}
if (authToken.expiresAt < new Date()) {
await adapter.deleteToken(hash);
throw new SignupError("token_expired", "This link has expired");
}
if (!authToken.email || authToken.role === null) {
throw new SignupError("invalid_token", "Invalid token data");
}
return {
email: authToken.email,
role: authToken.role,
};
}
/**
* Complete signup process (after passkey registration)
*/
export async function completeSignup(
adapter: AuthAdapter,
token: string,
userData: {
name?: string;
avatarUrl?: string;
},
): Promise<User> {
const hash = hashToken(token);
// Validate token one more time
const authToken = await adapter.getToken(hash, "email_verify");
if (!authToken || authToken.expiresAt < new Date()) {
throw new SignupError("invalid_token", "Invalid or expired verification");
}
if (!authToken.email || authToken.role === null) {
throw new SignupError("invalid_token", "Invalid token data");
}
// Check user doesn't already exist
const existing = await adapter.getUserByEmail(authToken.email);
if (existing) {
await adapter.deleteToken(hash);
throw new SignupError("user_exists", "An account with this email already exists");
}
// Delete token (single-use)
await adapter.deleteToken(hash);
// Create user
const user = await adapter.createUser({
email: authToken.email,
name: userData.name,
avatarUrl: userData.avatarUrl,
role: authToken.role,
emailVerified: true,
});
return user;
}
export class SignupError extends Error {
constructor(
public code:
| "invalid_token"
| "token_expired"
| "user_exists"
| "domain_not_allowed"
| "email_not_configured",
message: string,
) {
super(message);
this.name = "SignupError";
}
}
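The `timingDelay()` calls above keep the "user exists" and "domain not allowed" early returns from completing measurably faster than the full token-generation-plus-email path. The helper itself is not shown in this file; a hypothetical sketch of what such a helper could look like:

```typescript
// Hypothetical sketch of a timingDelay() helper (not the actual
// implementation): a fixed floor plus random jitter so short-circuit
// paths don't leak account existence through response timing.
async function timingDelay(minMs = 100, jitterMs = 50): Promise<void> {
  const ms = minMs + Math.random() * jitterMs;
  await new Promise<void>((resolve) => setTimeout(resolve, ms));
}
```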

import { describe, it, expect } from "vitest";
import {
generateToken,
hashToken,
generateTokenWithHash,
generateSessionId,
generateAuthSecret,
secureCompare,
computeS256Challenge,
encrypt,
decrypt,
} from "./tokens.js";
const BASE64URL_REGEX = /^[A-Za-z0-9_-]+$/;
// Intentionally the same pattern: base64url without padding never contains "=".
const NO_PADDING_REGEX = BASE64URL_REGEX;
describe("tokens", () => {
describe("generateToken", () => {
it("generates a base64url-encoded token", () => {
const token = generateToken();
expect(token).toMatch(BASE64URL_REGEX);
// 32 bytes = 43 base64url characters (without padding)
expect(token.length).toBe(43);
});
it("generates unique tokens", () => {
// eslint-disable-next-line e18e/prefer-array-fill -- We need unique tokens, not the same token repeated
const tokens = new Set(Array.from({ length: 100 }, () => generateToken()));
expect(tokens.size).toBe(100);
});
});
describe("hashToken", () => {
it("produces consistent hashes", () => {
const token = generateToken();
const hash1 = hashToken(token);
const hash2 = hashToken(token);
expect(hash1).toBe(hash2);
});
it("produces different hashes for different tokens", () => {
const token1 = generateToken();
const token2 = generateToken();
expect(hashToken(token1)).not.toBe(hashToken(token2));
});
});
describe("generateTokenWithHash", () => {
it("returns both token and hash", () => {
const { token, hash } = generateTokenWithHash();
expect(token).toBeDefined();
expect(hash).toBeDefined();
expect(hashToken(token)).toBe(hash);
});
});
describe("generateSessionId", () => {
it("generates a shorter session ID", () => {
const sessionId = generateSessionId();
expect(sessionId).toMatch(BASE64URL_REGEX);
// 20 bytes = 27 base64url characters
expect(sessionId.length).toBe(27);
});
});
describe("generateAuthSecret", () => {
it("generates a 32-byte secret", () => {
const secret = generateAuthSecret();
expect(secret).toMatch(BASE64URL_REGEX);
expect(secret.length).toBe(43);
});
});
describe("secureCompare", () => {
it("returns true for equal strings", () => {
expect(secureCompare("hello", "hello")).toBe(true);
});
it("returns false for different strings", () => {
expect(secureCompare("hello", "world")).toBe(false);
});
it("returns false for different length strings", () => {
expect(secureCompare("hello", "hello!")).toBe(false);
});
});
describe("computeS256Challenge", () => {
it("produces correct S256 challenge for a known verifier", () => {
// RFC 7636 Appendix B test vector:
// verifier = "dBjftJeZ4CVP-mB92K27uhbUJU1p1r_wW1gFWFOEjXk"
// expected challenge = "E9Melhoa2OwvFrEMTJguCHaoeK1t8URWbuGJSstw-cM"
const challenge = computeS256Challenge("dBjftJeZ4CVP-mB92K27uhbUJU1p1r_wW1gFWFOEjXk");
expect(challenge).toBe("E9Melhoa2OwvFrEMTJguCHaoeK1t8URWbuGJSstw-cM");
});
it("produces base64url output without padding", () => {
const challenge = computeS256Challenge("test-verifier-string");
expect(challenge).toMatch(NO_PADDING_REGEX);
expect(challenge).not.toContain("=");
});
it("is deterministic", () => {
const a = computeS256Challenge("same-input");
const b = computeS256Challenge("same-input");
expect(a).toBe(b);
});
it("produces different output for different input", () => {
const a = computeS256Challenge("verifier-one");
const b = computeS256Challenge("verifier-two");
expect(a).not.toBe(b);
});
});
describe("encrypt/decrypt", () => {
const secret = generateAuthSecret();
it("encrypts and decrypts a string", async () => {
const plaintext = "my-secret-value";
const encrypted = await encrypt(plaintext, secret);
const decrypted = await decrypt(encrypted, secret);
expect(decrypted).toBe(plaintext);
});
it("produces different ciphertext each time (due to random IV)", async () => {
const plaintext = "my-secret-value";
const encrypted1 = await encrypt(plaintext, secret);
const encrypted2 = await encrypt(plaintext, secret);
expect(encrypted1).not.toBe(encrypted2);
});
it("fails to decrypt with wrong secret", async () => {
const plaintext = "my-secret-value";
const encrypted = await encrypt(plaintext, secret);
const wrongSecret = generateAuthSecret();
await expect(decrypt(encrypted, wrongSecret)).rejects.toThrow();
});
});
});

packages/auth/src/tokens.ts
/**
* Secure token utilities
*
* Crypto via Oslo.js (@oslojs/crypto). Base64url via @oslojs/encoding.
*
* Tokens are opaque random values. We store only the SHA-256 hash in the database.
*/
import { hmac } from "@oslojs/crypto/hmac";
import { sha256, SHA256 } from "@oslojs/crypto/sha2";
import { constantTimeEqual } from "@oslojs/crypto/subtle";
import { encodeBase64urlNoPadding, decodeBase64urlIgnorePadding } from "@oslojs/encoding";
const TOKEN_BYTES = 32; // 256 bits of entropy
// ---------------------------------------------------------------------------
// API Token Prefixes
// ---------------------------------------------------------------------------
/** Valid API token prefixes */
export const TOKEN_PREFIXES = {
PAT: "ec_pat_",
OAUTH_ACCESS: "ec_oat_",
OAUTH_REFRESH: "ec_ort_",
} as const;
// ---------------------------------------------------------------------------
// Scopes
// ---------------------------------------------------------------------------
/** All valid API token scopes */
export const VALID_SCOPES = [
"content:read",
"content:write",
"media:read",
"media:write",
"schema:read",
"schema:write",
"taxonomies:manage",
"menus:manage",
"settings:read",
"settings:manage",
"admin",
] as const;
export type ApiTokenScope = (typeof VALID_SCOPES)[number];
/**
* Validate that scopes are all valid.
* Returns the invalid scopes, or empty array if all valid.
*/
export function validateScopes(scopes: string[]): string[] {
const validSet = new Set<string>(VALID_SCOPES);
return scopes.filter((s) => !validSet.has(s));
}
/**
* Scope grants — when a token holds the key scope, it implicitly grants
* the listed scopes too. This keeps existing tokens working when more
* granular scopes are introduced.
*
* Specifically, `content:write` was historically the only scope we checked
* for menu and taxonomy mutations. After splitting those out into
* `menus:manage` and `taxonomies:manage`, existing PATs with `content:write`
* continue to work via this grant table.
*
* Lookup is one-hop — chaining (`A → B → C`) is NOT supported. If a chain
* is needed, expand the values explicitly. Backed by a `Map` rather than a
* plain object so prototype-chain keys (`__proto__`, `constructor`, etc.)
* can't smuggle non-array values through bracket access.
*/
const IMPLICIT_SCOPE_GRANTS = new Map<string, readonly string[]>([
["content:write", ["menus:manage", "taxonomies:manage"]],
]);
/**
* Check if a set of scopes includes a required scope.
*
* The `admin` scope grants access to everything. `content:write` implicitly
* grants `menus:manage` and `taxonomies:manage` to preserve backwards
* compatibility with PATs issued before those scopes were split out.
*/
export function hasScope(scopes: string[], required: string): boolean {
if (scopes.includes("admin")) return true;
if (scopes.includes(required)) return true;
for (const held of scopes) {
const granted = IMPLICIT_SCOPE_GRANTS.get(held);
if (granted?.includes(required)) return true;
}
return false;
}
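The one-hop semantics documented above can be illustrated with a standalone re-implementation (`hasScopeSketch` and `GRANTS` are names made up for this sketch; the real function and table are defined in this file):

```typescript
// Standalone illustration of the one-hop implicit-grant lookup.
const GRANTS = new Map<string, readonly string[]>([
  ["content:write", ["menus:manage", "taxonomies:manage"]],
]);

function hasScopeSketch(scopes: string[], required: string): boolean {
  if (scopes.includes("admin") || scopes.includes(required)) return true;
  // One hop only: check each held scope's grant list, never recurse.
  return scopes.some((held) => GRANTS.get(held)?.includes(required) ?? false);
}

// A legacy PAT holding content:write can still mutate menus:
hasScopeSketch(["content:write"], "menus:manage"); // true
// ...but grants do not chain or reverse:
hasScopeSketch(["menus:manage"], "content:write"); // false
```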
/**
* Generate a cryptographically secure random token
* Returns base64url-encoded string (URL-safe)
*/
export function generateToken(): string {
const bytes = new Uint8Array(TOKEN_BYTES);
crypto.getRandomValues(bytes);
return encodeBase64urlNoPadding(bytes);
}
/**
* Hash a token for storage
* We never store raw tokens - only their SHA-256 hash
*/
export function hashToken(token: string): string {
const bytes = decodeBase64urlIgnorePadding(token);
const hash = sha256(bytes);
return encodeBase64urlNoPadding(hash);
}
/**
* Generate a token and its hash together
*/
export function generateTokenWithHash(): { token: string; hash: string } {
const token = generateToken();
const hash = hashToken(token);
return { token, hash };
}
/**
* Generate a session ID (shorter, for cookie storage)
*/
export function generateSessionId(): string {
const bytes = new Uint8Array(20); // 160 bits
crypto.getRandomValues(bytes);
return encodeBase64urlNoPadding(bytes);
}
/**
* Generate an auth secret for configuration
*/
export function generateAuthSecret(): string {
const bytes = new Uint8Array(32);
crypto.getRandomValues(bytes);
return encodeBase64urlNoPadding(bytes);
}
// ---------------------------------------------------------------------------
// Prefixed API tokens (ec_pat_, ec_oat_, ec_ort_)
// ---------------------------------------------------------------------------
/**
* Generate a prefixed API token and its hash.
* Returns the raw token (shown once to the user), the hash (stored server-side),
* and a display prefix (for identification in UIs/logs).
*
* Uses oslo/crypto for SHA-256 hashing.
*/
export function generatePrefixedToken(prefix: string): {
raw: string;
hash: string;
prefix: string;
} {
const bytes = new Uint8Array(TOKEN_BYTES);
crypto.getRandomValues(bytes);
const encoded = encodeBase64urlNoPadding(bytes);
const raw = `${prefix}${encoded}`;
const hash = hashPrefixedToken(raw);
// First few chars for identification in UIs
const displayPrefix = raw.slice(0, prefix.length + 4);
return { raw, hash, prefix: displayPrefix };
}
/**
* Hash a prefixed API token for storage/lookup.
* Hashes the full prefixed token string via SHA-256, returns base64url (no padding).
*/
export function hashPrefixedToken(token: string): string {
const bytes = new TextEncoder().encode(token);
const hash = sha256(bytes);
return encodeBase64urlNoPadding(hash);
}
// ---------------------------------------------------------------------------
// PKCE (RFC 7636) — server-side verification
// ---------------------------------------------------------------------------
/**
* Compute an S256 PKCE code challenge from a code verifier.
* Used server-side to verify that code_verifier matches the stored code_challenge.
*
* Equivalent to: BASE64URL(SHA256(ASCII(code_verifier)))
*/
export function computeS256Challenge(codeVerifier: string): string {
const hash = sha256(new TextEncoder().encode(codeVerifier));
return encodeBase64urlNoPadding(hash);
}
/**
* Constant-time comparison to prevent timing attacks
*/
export function secureCompare(a: string, b: string): boolean {
const text = new TextEncoder();
const salt = crypto.getRandomValues(new Uint8Array(TOKEN_BYTES));
const hash = (str: string) => hmac(SHA256, salt, text.encode(str));
return constantTimeEqual(hash(a), hash(b));
}
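The random-salt HMAC step is what makes this safe for inputs of unequal length: both strings are mapped to fixed-length digests before the constant-time equality check, so neither length nor content leaks through the comparison. A Node-flavoured sketch of the same pattern (an assumed equivalent using `node:crypto`, not the Oslo internals):

```typescript
import { createHmac, randomBytes, timingSafeEqual } from "node:crypto";

// Hash both inputs with a fresh per-call random key so the final
// comparison always operates on equal-length 32-byte digests.
function secureCompareSketch(a: string, b: string): boolean {
  const salt = randomBytes(32);
  const digest = (s: string) => createHmac("sha256", salt).update(s).digest();
  return timingSafeEqual(digest(a), digest(b));
}
```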
// ============================================================================
// Encryption utilities (for storing OAuth secrets)
// ============================================================================
const ALGORITHM = "AES-GCM";
const IV_BYTES = 12;
/**
* Derive an encryption key from the auth secret
*/
async function deriveKey(secret: string): Promise<CryptoKey> {
const decoded = decodeBase64urlIgnorePadding(secret);
// Create a new ArrayBuffer to ensure compatibility with crypto.subtle
const buffer = new Uint8Array(decoded).buffer;
const keyMaterial = await crypto.subtle.importKey("raw", buffer, "PBKDF2", false, ["deriveKey"]);
return crypto.subtle.deriveKey(
{
name: "PBKDF2",
salt: new TextEncoder().encode("emdash-auth-v1"),
iterations: 100000,
hash: "SHA-256",
},
keyMaterial,
{ name: ALGORITHM, length: 256 },
false,
["encrypt", "decrypt"],
);
}
/**
* Encrypt a value using AES-GCM
*/
export async function encrypt(plaintext: string, secret: string): Promise<string> {
const key = await deriveKey(secret);
const iv = crypto.getRandomValues(new Uint8Array(IV_BYTES));
const encoded = new TextEncoder().encode(plaintext);
const ciphertext = await crypto.subtle.encrypt({ name: ALGORITHM, iv }, key, encoded);
// Prepend IV to ciphertext
const combined = new Uint8Array(iv.length + ciphertext.byteLength);
combined.set(iv);
combined.set(new Uint8Array(ciphertext), iv.length);
return encodeBase64urlNoPadding(combined);
}
/**
* Decrypt a value encrypted with encrypt()
*/
export async function decrypt(encrypted: string, secret: string): Promise<string> {
const key = await deriveKey(secret);
const combined = decodeBase64urlIgnorePadding(encrypted);
const iv = combined.slice(0, IV_BYTES);
const ciphertext = combined.slice(IV_BYTES);
const decrypted = await crypto.subtle.decrypt({ name: ALGORITHM, iv }, key, ciphertext);
return new TextDecoder().decode(decrypted);
}

packages/auth/src/types.ts
/**
* Core types for @emdash-cms/auth
*/
// ============================================================================
// Roles & Permissions
// ============================================================================
export const Role = {
SUBSCRIBER: 10,
CONTRIBUTOR: 20,
AUTHOR: 30,
EDITOR: 40,
ADMIN: 50,
} as const;
export type RoleLevel = (typeof Role)[keyof typeof Role];
export type RoleName = keyof typeof Role;
export function roleFromLevel(level: number): RoleName | undefined {
const entry = Object.entries(Role).find(([, v]) => v === level);
if (!entry) return undefined;
const name = entry[0];
if (isRoleName(name)) return name;
return undefined;
}
function isRoleName(value: string): value is RoleName {
return value in Role;
}
const ROLE_LEVEL_MAP = new Map<number, RoleLevel>(Object.values(Role).map((v) => [v, v]));
export function toRoleLevel(value: number): RoleLevel {
const level = ROLE_LEVEL_MAP.get(value);
if (level !== undefined) return level;
throw new Error(`Invalid role level: ${value}`);
}
const DEVICE_TYPE_MAP: Record<string, DeviceType | undefined> = {
singleDevice: "singleDevice",
multiDevice: "multiDevice",
};
export function toDeviceType(value: string): DeviceType {
const dt = DEVICE_TYPE_MAP[value];
if (dt !== undefined) return dt;
throw new Error(`Invalid device type: ${value}`);
}
const TOKEN_TYPE_MAP: Record<string, TokenType | undefined> = {
magic_link: "magic_link",
email_verify: "email_verify",
invite: "invite",
recovery: "recovery",
};
export function toTokenType(value: string): TokenType {
const tt = TOKEN_TYPE_MAP[value];
if (tt !== undefined) return tt;
throw new Error(`Invalid token type: ${value}`);
}
export function roleToLevel(name: RoleName): RoleLevel {
return Role[name];
}
// ============================================================================
// User
// ============================================================================
export interface User {
id: string;
email: string;
name: string | null;
avatarUrl: string | null;
role: RoleLevel;
emailVerified: boolean;
disabled: boolean;
data: Record<string, unknown> | null;
createdAt: Date;
updatedAt: Date;
}
export interface NewUser {
email: string;
name?: string | null;
avatarUrl?: string | null;
role?: RoleLevel;
emailVerified?: boolean;
data?: Record<string, unknown> | null;
}
export interface UpdateUser {
email?: string;
name?: string | null;
avatarUrl?: string | null;
role?: RoleLevel;
emailVerified?: boolean;
disabled?: boolean;
data?: Record<string, unknown> | null;
}
// ============================================================================
// Credentials (Passkeys)
// ============================================================================
export type AuthenticatorTransport = "usb" | "nfc" | "ble" | "internal" | "hybrid";
export type DeviceType = "singleDevice" | "multiDevice";
export interface Credential {
id: string; // Base64url credential ID
userId: string;
publicKey: Uint8Array; // COSE public key
counter: number;
deviceType: DeviceType;
backedUp: boolean;
transports: AuthenticatorTransport[];
name: string | null;
createdAt: Date;
lastUsedAt: Date;
}
export interface NewCredential {
id: string;
userId: string;
publicKey: Uint8Array;
counter: number;
deviceType: DeviceType;
backedUp: boolean;
transports: AuthenticatorTransport[];
name?: string | null;
}
// ============================================================================
// Sessions
// ============================================================================
export interface Session {
id: string;
userId: string;
expiresAt: Date;
ipAddress: string | null;
userAgent: string | null;
createdAt: Date;
}
export interface SessionData {
userId: string;
expiresAt: number; // Unix timestamp
}
// ============================================================================
// Auth Tokens (magic links, invites, etc.)
// ============================================================================
export type TokenType = "magic_link" | "email_verify" | "invite" | "recovery";
export interface AuthToken {
hash: string; // SHA-256 hash of the raw token
userId: string | null; // null for pre-user tokens (invite/signup)
email: string | null; // For pre-user tokens
type: TokenType;
role: RoleLevel | null; // For invites
invitedBy: string | null;
expiresAt: Date;
createdAt: Date;
}
export interface NewAuthToken {
hash: string;
userId?: string | null;
email?: string | null;
type: TokenType;
role?: RoleLevel | null;
invitedBy?: string | null;
expiresAt: Date;
}
// ============================================================================
// OAuth Accounts
// ============================================================================
export interface OAuthAccount {
provider: string;
providerAccountId: string;
userId: string;
createdAt: Date;
}
export interface NewOAuthAccount {
provider: string;
providerAccountId: string;
userId: string;
}
// ============================================================================
// OAuth Connections (SSO config)
// ============================================================================
export interface OAuthConnection {
id: string;
name: string;
provider: "oidc" | "github" | "google";
clientId: string;
clientSecretEnc: string; // Encrypted
issuerUrl: string | null;
config: Record<string, unknown> | null;
enabled: boolean;
createdAt: Date;
}
// ============================================================================
// OAuth Clients (when EmDash is provider)
// ============================================================================
export interface OAuthClient {
id: string;
name: string;
secretHash: string;
redirectUris: string[];
scopes: string[];
createdAt: Date;
}
// ============================================================================
// Allowed Domains (self-signup)
// ============================================================================
export interface AllowedDomain {
domain: string;
defaultRole: RoleLevel;
enabled: boolean;
createdAt: Date;
}
// ============================================================================
// User Listing Types (for admin UI)
// ============================================================================
/** Extended user with list view computed fields */
export interface UserListItem extends User {
lastLogin: Date | null;
credentialCount: number;
oauthProviders: string[];
}
/** User with full details including related data */
export interface UserWithDetails {
user: User;
credentials: Credential[];
oauthAccounts: OAuthAccount[];
lastLogin: Date | null;
}
// ============================================================================
// Auth Adapter Interface
// ============================================================================
export interface AuthAdapter {
// Users
getUserById(id: string): Promise<User | null>;
getUserByEmail(email: string): Promise<User | null>;
createUser(user: NewUser): Promise<User>;
updateUser(id: string, data: UpdateUser): Promise<void>;
deleteUser(id: string): Promise<void>;
countUsers(): Promise<number>;
// User listing and details (for admin)
getUsers(options?: {
search?: string;
role?: number;
cursor?: string;
limit?: number;
}): Promise<{ items: UserListItem[]; nextCursor?: string }>;
getUserWithDetails(id: string): Promise<UserWithDetails | null>;
countAdmins(): Promise<number>;
// Credentials
getCredentialById(id: string): Promise<Credential | null>;
getCredentialsByUserId(userId: string): Promise<Credential[]>;
createCredential(credential: NewCredential): Promise<Credential>;
updateCredentialCounter(id: string, counter: number): Promise<void>;
updateCredentialName(id: string, name: string | null): Promise<void>;
deleteCredential(id: string): Promise<void>;
countCredentialsByUserId(userId: string): Promise<number>;
// Auth Tokens
createToken(token: NewAuthToken): Promise<void>;
getToken(hash: string, type: TokenType): Promise<AuthToken | null>;
deleteToken(hash: string): Promise<void>;
deleteExpiredTokens(): Promise<void>;
// OAuth Accounts
getOAuthAccount(provider: string, providerAccountId: string): Promise<OAuthAccount | null>;
getOAuthAccountsByUserId(userId: string): Promise<OAuthAccount[]>;
createOAuthAccount(account: NewOAuthAccount): Promise<OAuthAccount>;
deleteOAuthAccount(provider: string, providerAccountId: string): Promise<void>;
// Allowed Domains
getAllowedDomain(domain: string): Promise<AllowedDomain | null>;
getAllowedDomains(): Promise<AllowedDomain[]>;
createAllowedDomain(domain: string, defaultRole: RoleLevel): Promise<AllowedDomain>;
updateAllowedDomain(domain: string, enabled: boolean, defaultRole?: RoleLevel): Promise<void>;
deleteAllowedDomain(domain: string): Promise<void>;
}
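For unit tests, the token portion of this interface is small enough to back with an in-memory stand-in. A hypothetical sketch (covers only the four token methods, with a pared-down token shape invented for the example):

```typescript
// Pared-down token shape for this sketch; the real adapter uses
// NewAuthToken/AuthToken from this module.
interface NewAuthTokenLite {
  hash: string;
  email?: string | null;
  type: string;
  role?: number | null;
  expiresAt: Date;
}
type StoredToken = NewAuthTokenLite & { createdAt: Date };

// In-memory store keyed by token hash, matching the createToken /
// getToken / deleteToken / deleteExpiredTokens contract.
class MemoryTokenStore {
  private tokens = new Map<string, StoredToken>();
  async createToken(t: NewAuthTokenLite): Promise<void> {
    this.tokens.set(t.hash, { ...t, createdAt: new Date() });
  }
  async getToken(hash: string, type: string): Promise<StoredToken | null> {
    const t = this.tokens.get(hash);
    // Lookups are type-qualified: an email_verify hash never matches invite.
    return t && t.type === type ? t : null;
  }
  async deleteToken(hash: string): Promise<void> {
    this.tokens.delete(hash);
  }
  async deleteExpiredTokens(): Promise<void> {
    const now = Date.now();
    for (const [hash, t] of this.tokens) {
      if (t.expiresAt.getTime() < now) this.tokens.delete(hash);
    }
  }
}
```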
// ============================================================================
// Email Adapter Interface
// ============================================================================
export interface EmailMessage {
to: string;
subject: string;
text: string;
html?: string;
}
export interface EmailAdapter {
send(message: EmailMessage): Promise<void>;
}
// ============================================================================
// Auth Errors
// ============================================================================
export class AuthError extends Error {
constructor(
public code: AuthErrorCode,
message?: string,
) {
super(message ?? code);
this.name = "AuthError";
}
}
export type AuthErrorCode =
| "invalid_credentials"
| "invalid_token"
| "token_expired"
| "user_not_found"
| "user_exists"
| "credential_exists"
| "max_credentials"
| "email_not_verified"
| "signup_not_allowed"
| "domain_not_allowed"
| "forbidden"
| "unauthorized"
| "rate_limited"
| "invalid_request"
| "internal_error";

{
"compilerOptions": {
"target": "ES2024",
"module": "preserve",
"moduleResolution": "bundler",
"declaration": true,
"declarationMap": true,
"strict": true,
"noUncheckedIndexedAccess": true,
"noImplicitOverride": true,
"skipLibCheck": true,
"esModuleInterop": true,
"isolatedModules": true,
"verbatimModuleSyntax": true,
"outDir": "dist",
"rootDir": "src"
},
"include": ["src"]
}

import { defineConfig } from "tsdown";
export default defineConfig({
entry: [
"src/index.ts",
"src/passkey/index.ts",
"src/adapters/kysely.ts",
"src/oauth/providers/github.ts",
"src/oauth/providers/google.ts",
],
format: "esm",
dts: true,
clean: true,
});