Storage Operations
This guide explains the core storage concepts and provides examples of how to use the Synapse SDK to store, retrieve, and manage data on Filecoin On-Chain Cloud.
Key Concepts
Data Set: A logical container of pieces stored with one provider. When a data set is created, a payment rail is established with that provider. All pieces in the data set share this single payment rail and are verified together via PDP proofs.
PieceCID: Content-addressed identifier for your data (format: bafkzcib...). Automatically calculated during upload and used to retrieve data from any provider.
Metadata: Optional key-value pairs for organization:
- Data Set Metadata: Max 10 keys (e.g., project, environment)
- Piece Metadata: Max 5 keys per piece (e.g., filename, contentType)
Storage Manager: The main entry point for storage operations. Handles provider selection, data set management, and provides downloads from any provider (provider-agnostic) using the StorageContext.
Storage Context: A connection to a specific storage provider and data set. Created explicitly for fine-grained control or automatically by StorageManager. Enables uploads and downloads with the specific storage provider.
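To make the provider-agnostic retrieval concrete, here is a minimal sketch: a PieceCID is all you need to fetch data later through the Storage Manager, regardless of which provider holds it (the CID value below is a placeholder, not a real identifier):

```typescript
// A minimal sketch: download by PieceCID alone; the Storage Manager locates a
// provider for you. "bafkzcib..." is a placeholder, not a real CID.
const pieceCid = "bafkzcib...";
const bytes = await synapse.storage.download(pieceCid);
console.log(`Fetched ${bytes.length} bytes`);
```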
Storage Approaches
The SDK offers two ways to work with storage operations:
| Approach | Who It’s For | What SDK Handles | When to Use |
|---|---|---|---|
| Auto-Managed | Most developers | Provider selection, data set creation, management | Getting started, simple apps, quick prototypes |
| Explicit Control | Advanced users | Nothing; you control everything | Batch operations, specific providers, cost optimization |
Recommendation: Start with auto-managed, then explore explicit control only if needed.
Example 1: Quick Start (Auto-Managed)
Upload and download data with zero configuration; the SDK automatically selects a provider and manages the data set:
```typescript
const data = new Uint8Array([1, 2, 3, 4, 5]);

const result = await synapse.storage.upload(data);
const downloaded = await synapse.storage.download(result.pieceCid);

console.log("Uploaded:", result.pieceCid);
console.log("Downloaded:", downloaded.length, "bytes");
```
Example 2: With Metadata (Auto-Managed)
Add metadata to organize uploads and enable faster data set reuse; the SDK will reuse any existing data set matching the metadata:
```typescript
const context = await synapse.storage.createContext({
  metadata: {
    Application: "My DApp",
    Version: "1.0.0",
    Category: "Documents",
  },
});

const result = await synapse.storage.upload(data, { context });
console.log("Uploaded:", result.pieceCid);
```
Data Set Management
Data sets are automatically created during your first upload to a provider. For explicit management of data sets, use these operations:
When You Need Explicit Data Sets:
- Uploading many files to the same provider
- Keeping a consistent provider for your application
- Tracking costs per data set
- Building batch upload workflows (see the sketch below)
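A minimal sketch of that batch pattern: create one context explicitly, then route every upload through it so all pieces land in the same data set with the same provider (the metadata values are placeholders):

```typescript
// Create one context up front, then reuse it for every upload in the batch.
const context = await synapse.storage.createContext({
  metadata: { Application: "My DApp" }, // placeholder metadata
});

const files: Uint8Array[] = [
  new Uint8Array([1, 2, 3]),
  new Uint8Array([4, 5, 6]),
];

for (const file of files) {
  const result = await synapse.storage.upload(file, { context });
  console.log("Uploaded piece:", result.pieceCid);
}
```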
Getting all data sets
Retrieve all data sets owned by your account to inspect piece counts, CDN status, and metadata:
```typescript
const dataSets = await synapse.storage.findDataSets();

for (const ds of dataSets) {
  console.log(`Dataset ${ds.pdpVerifierDataSetId}:`, {
    live: ds.isLive,
    cdn: ds.withCDN,
    pieces: ds.activePieceCount,
    metadata: ds.metadata,
  });
}
```
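If you tag data sets with metadata (as in Example 2), you can pick one back out of this list. A minimal sketch, assuming an "Application" key was set earlier (the key and value are placeholders):

```typescript
// Find the data set previously tagged for this application.
const appDataSet = dataSets.find((ds) => ds.metadata.Application === "My DApp");
if (appDataSet !== undefined) {
  console.log("Reusing data set", appDataSet.pdpVerifierDataSetId);
}
```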
Getting data set pieces
List all pieces stored in a specific data set by iterating through the context:
```typescript
const context = await synapse.storage.createContext({ dataSetId });

const pieces = [];
for await (const piece of context.getPieces()) {
  pieces.push(piece);
}
console.log(`Found ${pieces.length} pieces`);
```
Getting data set size
Calculate total storage size from the data set's on-chain leaf count; each leaf corresponds to 32 bytes:
```typescript
const pdpVerifier = PDPVerifier.create();

const leafCount = await pdpVerifier.getDataSetLeafCount(dataSetId);
const sizeInBytes = leafCount * 32n; // Each leaf is 32 bytes
console.log(`Data set size: ${sizeInBytes} bytes`);
```
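For display you may want something friendlier than raw bytes; a minimal sketch converting the result to GiB:

```typescript
// Convert the bigint byte count to GiB for display (Number() is fine at these sizes).
const sizeInGiB = Number(sizeInBytes) / 1024 ** 3;
console.log(`Data set size: ${sizeInGiB.toFixed(2)} GiB`);
```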
Getting a data set piece's metadata
Access custom metadata attached to individual pieces for organization and filtering:
```typescript
const warmStorage = WarmStorageService.create();

const metadata = await warmStorage.getPieceMetadata(dataSetId, piece.pieceId);
console.log("Piece metadata:", metadata);
```
Getting the size of a specific piece
Calculate the size of a specific piece by extracting it from the PieceCID:
```typescript
import { getSizeFromPieceCID } from "@filoz/synapse-sdk/piece";

const size = getSizeFromPieceCID(pieceCid);
console.log(`Piece size: ${size} bytes`);
```
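The same helper can approximate a data set's total footprint piece by piece. A hedged sketch, assuming records from getPieces() expose a pieceCid field and that the total fits in a JavaScript number:

```typescript
// Sum the size encoded in each PieceCID across a data set.
// Assumptions: getPieces() records expose pieceCid; Number() can hold the total.
let totalBytes = 0;
for await (const piece of context.getPieces()) {
  totalBytes += Number(getSizeFromPieceCID(piece.pieceCid));
}
console.log(`Total piece size: ${totalBytes} bytes`);
```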
Storage Information
Query service-wide pricing, available providers, and network parameters:
```typescript
const info = await synapse.getStorageInfo();
console.log("Price/TiB/month:", info.pricing.noCDN.perTiBPerMonth);
console.log("Providers:", info.providers.length);

const providerInfo = await synapse.getProviderInfo("0x...");
console.log("PDP URL:", providerInfo.pdp.serviceURL);
```
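From that pricing data you can sketch a rough monthly estimate for a given amount of stored data. A minimal sketch, assuming cost scales linearly with size and that perTiBPerMonth is denominated in the payment token's base units:

```typescript
// Rough monthly estimate for 10 GiB at the no-CDN rate (integer math, rounds down).
const TIB = 1024n ** 4n;
const storedBytes = 10n * 1024n ** 3n; // example: 10 GiB
const monthlyCost = (info.pricing.noCDN.perTiBPerMonth * storedBytes) / TIB;
console.log(`~${monthlyCost} base units of ${info.pricing.tokenSymbol} per month`);
```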
Next Steps
Ready to explore more? Here’s your learning path:
- Advanced Operations → Learn about batch uploads, lifecycle management, and download strategies. For developers building production applications with specific provider requirements.
- Plan Storage Costs → Calculate your monthly costs and understand funding requirements. Use the quick calculator to estimate costs in under 5 minutes.
- Payment Management → Manage deposits, approvals, and payment rails. Required before your first upload.