How do I use Piano Analytics data in external tools?

Piano Analytics provides several ways to use your data in external tools (BI platforms, data lakes, custom apps, etc.). Depending on the level of detail you need (aggregated reporting vs. raw event data), the expected volume, and how often you need updates, you can choose between the Reporting API / Data Query exports, Data Flow, or Data Sharing.

1) Reporting API (via Data Query)

The Reporting API lets you extract advanced datasets you have built in Data Query (properties, metrics, segments, time period). It is accessible from any tool that can perform HTTP requests (scripts, Postman, curl, ETL tools, etc.).

Typical use cases:

  • Feed a BI tool (Power BI, Tableau, Looker, etc.) with aggregated analytics datasets

  • Automate recurring extracts based on Data Query configurations

How it works (high level)

1. Build the dataset in Data Query (properties/metrics/segments/time period).
2. Copy the generated API call from the Data Query interface.
3. Authenticate using an API key (access key / secret key pair) created from your Piano Analytics profile.
4. Execute the call from your external tool.
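The steps above can be sketched in a few lines of Python. The endpoint URL and payload shape below follow the public v3 API, but treat them as illustrative: in practice you copy the exact call generated by the Data Query interface, and the key values shown here are placeholders.

```python
import json
from urllib import request

# Placeholder credentials -- create a real access/secret key pair from your
# Piano Analytics profile, and copy the generated call from Data Query.
ACCESS_KEY = "your-access-key"
SECRET_KEY = "your-secret-key"
API_URL = "https://api.atinternet.io/v3/data/getData"  # may differ for your setup

def build_request(payload: dict) -> request.Request:
    """Build (but do not send) an authenticated Reporting API request."""
    return request.Request(
        API_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            # The API key is sent as a single "accessKey_secretKey" value.
            "x-api-key": f"{ACCESS_KEY}_{SECRET_KEY}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

# Payload mirroring a Data Query configuration (illustrative values)
payload = {
    "columns": ["date", "m_visits"],
    "space": {"s": [123456]},  # site ID(s)
    "period": {"p1": [{"type": "D", "start": "2024-01-01", "end": "2024-01-31"}]},
    "max-results": 10000,
    "page-num": 1,
}
req = build_request(payload)
# urllib.request.urlopen(req) would execute the call; omitted here.
```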

Key limitations (API v3 / Reporting API)

  • Row limits: 10,000 rows per call, with the ability to go up to 200,000 rows using pagination.

  • Columns limit: up to 50 items in the columns parameter per request.

  • Segmentation limit: up to 6 segments per request.

  • Concurrency limits: up to 5 concurrent calls per user and 20 per organization.
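To go beyond 10,000 rows (up to the 200,000-row ceiling), you page through the results. The sketch below shows the pagination loop only; the `fetch_page` callable is a stand-in for the HTTP call, shown here with a fake in-memory data source so the logic is self-contained.

```python
# Pagination sketch for the Reporting API row limits described above.
MAX_ROWS_PER_CALL = 10_000
HARD_CAP = 200_000  # pagination ceiling

def paginate(fetch_page, max_rows_per_call=MAX_ROWS_PER_CALL, hard_cap=HARD_CAP):
    """Collect rows page by page until a short page or the hard cap."""
    rows, page = [], 1
    while len(rows) < hard_cap:
        batch = fetch_page(page, max_rows_per_call)
        rows.extend(batch)
        if len(batch) < max_rows_per_call:  # last page reached
            break
        page += 1
    return rows[:hard_cap]

# Usage with a fake data source of 25,000 rows (3 calls: 10k + 10k + 5k)
def fake_fetch(page, size, total=25_000):
    start = (page - 1) * size
    return list(range(start, min(start + size, total)))

all_rows = paginate(fake_fetch)
```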

Please refer to our documentation for the complete and up-to-date list of limits. If your use case exceeds them (very large extracts, frequent refreshes, or granular/raw event needs), use Data Query advanced exports, Data Flow, or Data Sharing instead.

2) Data Query exports (file exports)

Data Query exports allow you to export query results as files (for example, to use in Excel, upload to another tool, or ingest into an external pipeline).

Scheduled exports are a practical option when your external tool can ingest files (SFTP/S3) but does not support direct API integration.

Depending on your configuration and contractual options, exports can range from 10,000 rows up to the full result set of a query.

Scheduled exports: you can schedule exports (hourly/daily/weekly/monthly depending on your configuration) delivered as a GZIP archive containing a CSV file.

Common delivery targets include FTP/SFTP and Amazon S3 (AWS bucket).
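On the receiving side, an ingestion job decompresses the GZIP and parses the CSV. The sketch below does this in memory with Python's standard library; the column names are illustrative, not the exact export schema.

```python
import csv
import gzip
import io

# Simulate a delivered export: a GZIP file containing a CSV.
sample = "date,visits\n2024-01-01,120\n2024-01-02,95\n"
buf = io.BytesIO()
with gzip.open(buf, "wt", encoding="utf-8", newline="") as f:
    f.write(sample)

# Read it back the way an ingestion job would after downloading
# the file from FTP/SFTP or S3.
buf.seek(0)
with gzip.open(buf, "rt", encoding="utf-8", newline="") as f:
    rows = list(csv.DictReader(f))
```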

3) Data Flow (optional on the contract)

Data Flow is designed for large-scale, granular, and exhaustive exports—typically to populate a data lake or analytics platform where you combine Piano data with other sources.

Use Data Flow when you need ongoing, high-volume delivery and want full control of transformation/modeling outside Piano Analytics.

How Data Flow works

  • Exports files containing all collected events at a regular frequency: 15 / 30 / 60 minutes

  • Supports export formats such as CSV, JSON, or Parquet

  • Delivery options typically include FTP/SFTP, Amazon S3, or Google Cloud Platform, depending on your setup

Important considerations

  • Data Flow exports raw/granular event data. It is not a “reporting” export.

  • Native and custom metrics are not exported in Data Flow. If you need those values, you typically reprocess raw data externally using the same definitions you use in the interface.

4) Data Sharing (optional on the contract)

Data Sharing lets you query your Piano Analytics data directly using SQL in Snowflake, which is the underlying technology used for data storage.

This option is intended for teams that want to work directly at the database level and leverage Snowflake tooling and ecosystem connectors (direct SQL access to raw tables, easy integration with many third-party tools through Snowflake native connectors).

Access modes

  • A Reader Account in the Piano Analytics Snowflake environment, or

  • Secure data sharing between your Snowflake account and Piano’s Snowflake environment (if you have your own Snowflake)
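In either mode, you work with plain SQL against the shared tables. The sketch below builds a simple aggregation query in Python; the table and column names are hypothetical (the real schema comes with your Data Sharing setup), and the resulting SQL can be run with any Snowflake client, for example the Snowflake Connector for Python.

```python
# Sketch of a SQL query against a Data Sharing table in Snowflake.
# Table name (events) and columns are illustrative placeholders.
def daily_page_views_query(site_id: int, start: str, end: str) -> str:
    """Build a daily page-views aggregation over a raw events table."""
    return (
        "SELECT event_date, COUNT(*) AS page_views\n"
        "FROM events\n"
        f"WHERE site_id = {site_id}\n"
        "  AND event_name = 'page.display'\n"
        f"  AND event_date BETWEEN '{start}' AND '{end}'\n"
        "GROUP BY event_date\n"
        "ORDER BY event_date"
    )

sql = daily_page_views_query(123456, "2024-01-01", "2024-01-31")
# Execute `sql` with your Snowflake client of choice.
```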

Freshness

Data Sharing is designed for near real-time access, typically with only a short processing delay (often a few minutes).
