# Client Middleware
The Orphnet Logging client SDK provides Hono middleware for automatic HTTP logging, a standalone logger for any JavaScript runtime, and multiple transports. This page is a cookbook covering all major use cases.
For the full API reference, see the client SDK README.
## Quick Setup with FetchTransport
The simplest setup: add middleware to a Hono app and every request is automatically logged.
```typescript
import { Hono } from 'hono'
import { createLoggingMiddleware } from '@orphnet/logging-api-client'

const app = new Hono()

app.use('*', createLoggingMiddleware({
  apiUrl: 'https://api.logvista.orph.dev',
  apiKey: 'sk_your_workspace_key', // Authorization: Bearer sk_...
}))

app.get('/', (c) => c.text('Hello'))

export default app
```

Each request logs an entry with type `request` (category: `http`), a message like `GET / 200 12ms`, and data containing `{ method, path, status, duration }`.
## BatchTransport for High-Throughput
Queue log entries and flush in batches to reduce HTTP requests:
```typescript
import { BatchTransport, createLoggingMiddleware } from '@orphnet/logging-api-client'

const batchTransport = new BatchTransport({
  apiUrl: 'https://api.logvista.orph.dev',
  apiKey: 'sk_your_key',
  maxBatchSize: 25,       // Flush when 25 entries are queued (default: 10)
  flushIntervalMs: 5000,  // Flush every 5 seconds (default: 5000)
  onError: (err) => console.error('Batch send failed:', err),
  onBatchFlush: (entries) => console.log(`Sending ${entries.length} entries`),
})

app.use('*', createLoggingMiddleware({
  apiUrl: 'https://api.logvista.orph.dev',
  apiKey: 'sk_your_key',
  transport: batchTransport,
}))

// Flush remaining logs before shutdown
process.on('SIGTERM', () => batchTransport.dispose())
```

`BatchTransport` uses a splice-based queue drain for atomic flushes: entries are removed from the queue before the HTTP request, preventing duplicates if the flush fails or overlaps with another flush.
## ConsoleTransport for Development
Output logs to the console without making network calls:
```typescript
import { ConsoleTransport, createLoggingMiddleware } from '@orphnet/logging-api-client'

app.use('*', createLoggingMiddleware({
  apiUrl: 'http://localhost:8787',
  apiKey: 'sk_dev_key',
  transport: new ConsoleTransport(),
}))
```

The output format maps log levels to console methods:

- `level: 'error'` uses `console.error('[type] message')`
- `level: 'debug'` uses `console.debug('[type] message')`
- `level: 'info'` uses `console.log('[type] message')`

Data objects are passed as the second argument when present.
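The mapping above can be sketched like this. This is a hypothetical re-implementation for illustration only (the `Entry` shape and function names are assumptions), not `ConsoleTransport`'s actual source:

```typescript
type Level = 'error' | 'debug' | 'info'

interface Entry {
  level: Level
  type: string
  message: string
  data?: Record<string, unknown>
}

// Pick the console method matching the documented level mapping.
function consoleMethodFor(level: Level): (...args: unknown[]) => void {
  if (level === 'error') return console.error
  if (level === 'debug') return console.debug
  return console.log
}

function writeToConsole(entry: Entry) {
  const log = consoleMethodFor(entry.level)
  if (entry.data !== undefined) {
    // Data object goes in as the second argument when present.
    log(`[${entry.type}] ${entry.message}`, entry.data)
  } else {
    log(`[${entry.type}] ${entry.message}`)
  }
}
```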
## Hono Auto-Logging Middleware
The middleware configuration controls automatic request logging:
| Option | Type | Default | Description |
|---|---|---|---|
| `apiUrl` | `string` | -- | Base URL of the Orphnet Logging API |
| `apiKey` | `string` | -- | API key for authentication |
| `autoLog` | `boolean` | `true` | Automatically log HTTP requests |
| `logHeaders` | `boolean` | `false` | Include request headers in log data |
| `logQueryParams` | `boolean` | `false` | Include query parameters in log data |
| `logRequestBody` | `boolean` | `false` | Include request body in log data |
| `logResponseBody` | `boolean` | `false` | Include response body in log data |
| `excludePaths` | `string[]` | `[]` | Path prefixes to skip auto-logging |
| `transport` | `LogTransport` | `FetchTransport` | Custom transport implementation |
```typescript
app.use('*', createLoggingMiddleware({
  apiUrl: 'https://api.logvista.orph.dev',
  apiKey: 'sk_your_key',
  autoLog: true,
  logHeaders: false,
  logQueryParams: true,
  excludePaths: ['/health', '/favicon.ico'],
}))
```

On Cloudflare Workers, the middleware uses `ctx.waitUntil()` to send logs without blocking the response. On Node or Bun, logs are sent as fire-and-forget promises.
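The two dispatch paths just described can be sketched as a single helper. This is a minimal illustration of the pattern, not the middleware's actual code; the `dispatch` name is an assumption:

```typescript
// On Workers, hand the promise to ctx.waitUntil() so the send can outlive the
// response. Elsewhere, fire and forget with a .catch() so a rejected send
// never surfaces as an unhandled rejection in the request handler.
function dispatch(
  sendPromise: Promise<void>,
  waitUntil?: (p: Promise<unknown>) => void,
) {
  if (waitUntil) {
    waitUntil(sendPromise)       // Workers: keep the isolate alive until sent
  } else {
    sendPromise.catch(() => {})  // Node/Bun: fire-and-forget
  }
}
```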
## Standalone Logger
Use `createLogger` outside of Hono in any JavaScript/TypeScript application:
```typescript
import { createLogger, FetchTransport } from '@orphnet/logging-api-client'

const transport = new FetchTransport({
  apiUrl: 'https://api.logvista.orph.dev',
  apiKey: 'sk_your_key',
})

const logger = createLogger(transport, 'project-id', 'session-id')

logger.ai('inference', { message: 'Model response received' })
logger.http('request', { message: 'External API called', data: { url: 'https://...' } })
logger.system('info', { message: 'Service initialized' })
logger.custom('billing', { message: 'Invoice created', data: { amount: 99 } })
```

### Logger Methods
| Method | Category | Description |
|---|---|---|
| `logger.ai(type, payload)` | `ai` | AI/LLM tracing |
| `logger.http(type, payload)` | `http` | HTTP request/response logging |
| `logger.system(type, payload)` | `system` | General-purpose logging |
| `logger.custom(type, payload)` | `custom` | Any custom type string |
Each method accepts a `type` string and a `LogPayload`:

```typescript
interface LogPayload {
  message: string
  data?: Record<string, unknown>
}
```

## Lifecycle Hooks
Hooks allow you to intercept, filter, and monitor log entries:
### onBeforeSend
Transform or filter entries before they are sent:
```typescript
app.use('*', createLoggingMiddleware({
  apiUrl: 'https://api.logvista.orph.dev',
  apiKey: 'sk_your_key',
  onBeforeSend: (entry) => {
    // Filter out debug logs in production
    if (entry.level === 'debug') return null
    // Redact sensitive data
    if (entry.data?.headers) {
      delete entry.data.headers
    }
    return entry
  },
}))
```

Return `null` to drop the entry entirely.
### onSuccess
Called after a log entry is successfully sent:
```typescript
onSuccess: (entry) => {
  metrics.increment('logs.sent', { type: entry.type })
}
```

### onError
Called when a transport error occurs:
```typescript
onError: (err, entry) => {
  console.error(`Failed to send log [${entry.type}]:`, err.message)
  // Report to monitoring (Sentry, Datadog, etc.)
}
```

Transport errors are never thrown; they are always routed to this callback.
### onBatchFlush
Called when BatchTransport flushes a batch:
```typescript
onBatchFlush: (entries) => {
  console.log(`Flushed ${entries.length} log entries`)
  metrics.gauge('logs.batch_size', entries.length)
}
```

### Combined Example
```typescript
app.use('*', createLoggingMiddleware({
  apiUrl: 'https://api.logvista.orph.dev',
  apiKey: 'sk_your_key',
  onBeforeSend: (entry) => {
    if (entry.level === 'debug') return null
    return entry
  },
  onSuccess: (entry) => {
    metrics.increment('logs.sent')
  },
  onError: (err) => {
    metrics.increment('logs.failed')
    console.error('[logging-sdk]', err.message)
  },
}))
```

## Workspace Auth Examples
### Workspace-Scoped Key
A single key for all projects in a workspace:
```typescript
const config = {
  apiUrl: 'https://api.logvista.orph.dev',
  apiKey: 'sk_ws_abc123', // Workspace key
}
```

### Project-Scoped Key
A key limited to a single project:
```typescript
const config = {
  apiUrl: 'https://api.logvista.orph.dev',
  apiKey: 'sk_proj_xyz789', // Project key
}
```

API keys are sent as `Authorization: Bearer sk_...` on every request.
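Concretely, the ingest request a transport builds might look like the sketch below. Only the `Authorization: Bearer sk_...` header shape is documented on this page; the `/logs` path, the `buildIngestRequest` name, and the field layout are assumptions for illustration (see the ingest API reference for the real endpoint):

```typescript
interface IngestRequest {
  url: string
  method: 'POST'
  headers: Record<string, string>
  body: string
}

function buildIngestRequest(apiUrl: string, apiKey: string, entry: unknown): IngestRequest {
  return {
    url: `${apiUrl}/logs`, // hypothetical path
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      Authorization: `Bearer ${apiKey}`, // sk_... workspace or project key
    },
    body: JSON.stringify(entry),
  }
}
```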
## Custom Categories
Use the `custom()` method for domain-specific log types:
```typescript
app.post('/api/checkout', async (c) => {
  const logger = c.get('logger')

  logger.custom('billing', {
    message: 'Payment processed',
    data: { amount: 49.99, currency: 'USD' },
  })

  logger.custom('audit', {
    message: 'Order created',
    data: { orderId: 'ord_123' },
  })

  return c.json({ ok: true })
})
```

Custom types are stored with `category: "custom"` and can be filtered by type in queries.
## Error Handling Patterns
### Graceful Degradation
Logging should never break your application. The SDK is designed for graceful degradation:
```typescript
app.use('*', createLoggingMiddleware({
  apiUrl: 'https://api.logvista.orph.dev',
  apiKey: 'sk_your_key',
  onError: (err) => {
    // Log locally but don't crash
    console.error('[logging-sdk] Transport error:', err.message)
  },
}))
```

### Retry with BatchTransport
`BatchTransport` automatically retries failed flushes. Configure error handling:
```typescript
const transport = new BatchTransport({
  apiUrl: 'https://api.logvista.orph.dev',
  apiKey: 'sk_your_key',
  maxBatchSize: 50,
  flushIntervalMs: 10000,
  onError: (err) => {
    // Entries remain in the queue for the next flush attempt
    console.error('Batch flush failed, will retry:', err.message)
  },
})
```

## Testing with Mocks
### Using ConsoleTransport
```typescript
import { ConsoleTransport, createLoggingMiddleware } from '@orphnet/logging-api-client'

const testApp = new Hono()

testApp.use('*', createLoggingMiddleware({
  apiUrl: 'http://localhost:8787',
  apiKey: 'sk_test',
  transport: new ConsoleTransport(),
}))
```

### Custom Mock Transport
Create a transport that captures entries for assertions:
```typescript
import type { LogTransport, LogEntry } from '@orphnet/logging-api-client'

class MockTransport implements LogTransport {
  entries: LogEntry[] = []

  send(entry: LogEntry) {
    this.entries.push(entry)
  }

  clear() {
    this.entries = []
  }
}

// In tests
const mock = new MockTransport()

app.use('*', createLoggingMiddleware({
  apiUrl: 'http://localhost:8787',
  apiKey: 'sk_test',
  transport: mock,
}))

// After a request
expect(mock.entries).toHaveLength(1)
expect(mock.entries[0].type).toBe('request')
```

## TypeScript Types
| Type | Description |
|---|---|
| `LoggingMiddlewareConfig` | Full configuration for the Hono middleware |
| `BatchTransportConfig` | Configuration for `BatchTransport` |
| `LifecycleHooks` | Optional hooks: `onBeforeSend`, `onSuccess`, `onError`, `onBatchFlush` |
| `LogTransport` | Interface for custom transports (`send(entry, ctx?)`) |
| `LogEntry` | Single log entry: `projectId`, `sessionId`, `message`, `level`, `type`, `data`, `timestamp` |
| `LogPayload` | Simplified payload: `message` plus optional `data` |
| `OrphnetLogger` | Logger interface: `ai()`, `http()`, `system()`, `custom()` methods |
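Based on the fields listed above, a `LogEntry` value might look like the sketch below. The interface is inferred from this page, not copied from the SDK; exact types, optionality, and the level union are assumptions:

```typescript
interface LogEntry {
  projectId: string
  sessionId: string
  message: string
  level: 'debug' | 'info' | 'error' // levels referenced on this page; the SDK may define more
  type: string
  data?: Record<string, unknown>
  timestamp: string // assumed ISO 8601
}

// Example entry shaped like the middleware's auto-logged request entries.
const entry: LogEntry = {
  projectId: 'project-id',
  sessionId: 'session-id',
  message: 'GET / 200 12ms',
  level: 'info',
  type: 'request',
  data: { method: 'GET', path: '/', status: 200, duration: 12 },
  timestamp: new Date().toISOString(),
}
```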
## Next Steps
- Log Categories -- Built-in and custom log types
- API Reference: Ingest -- Log ingestion endpoint
- Getting Started -- Quick start guide