How Datus Viewpoints Works
After creating your Datus Viewpoints account, you'll have the opportunity to join various programs. Before each one begins, we'll clearly outline what data will be collected, how it will be used, and how many points you'll earn for participating. We'll also let you know the number of points needed to receive a payout. Once you reach that threshold, your payment will be sent directly to your PayPal account or credited to your chosen payment method.

*The Datus Viewpoints app interface showing the welcome screen, available programs, and rewards.*
Privacy Matters
When you join Datus Viewpoints, we'll ask for some basic information like your name, email address, country of residence, date of birth, and gender. To help match you with the right programs, we may also request additional details such as your location.
Before you start any program, we'll clearly explain how the information you provide will be used.
We do not sell your data to third parties. Any information collected through Datus Viewpoints will never be shared publicly on Datus or any of your other accounts. You're always in control, and you can choose to stop participating at any time.
The Datus Viewpoints app is currently open to individuals aged 18 and older. Each program may have its own set of eligibility requirements. At the moment, Datus Viewpoints is available to U.S. residents with Datus accounts, but we're working on expanding access and adding more registration options in the near future.
1 Decentralised ownership of personal data
Instead of letting third-party SDKs and pixels deliver your information straight into the surveillance economy, each person (or household) keeps a Personal-Data Vault (PDV):
*Scenario 1: How apps and stores help Facebook track your activities.*
| Component | What it is | Key tech details |
|---|---|---|
| PDV node | Encrypted object store that only the owner's private key can decrypt. | Runs on a laptop, phone or hosted VPS. |
| Metadata ledger | Thin L2 blockchain holding pointers to encrypted blobs, licence terms and payment state. | e.g., a Polygon CDK roll-up. |
| Oracle cluster | Verifies licence compliance and scores data quality. | Listens to PDV publish events; triggers mint/burn. |
| Burn-meter gateway | Sits in front of commercial endpoints (S3, GCS, Snowflake Share). | Returns a signed URL only after `burn(n)` succeeds. |
Result: vaults stay private, but economic rights over access are fungible and enforceable on-chain.
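The PDV-plus-ledger split above can be sketched in a few lines of Python. This is a minimal illustration: the cipher is a toy SHA-256 counter keystream standing in for real authenticated encryption, and the `LedgerEntry` and `PersonalDataVault` names are assumptions for the sketch, not part of any existing SDK.

```python
import hashlib
import os
from dataclasses import dataclass, field

def _keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    # Toy SHA-256 counter keystream -- NOT production cryptography.
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, plaintext: bytes) -> tuple:
    nonce = os.urandom(16)
    ks = _keystream(key, nonce, len(plaintext))
    return nonce, bytes(a ^ b for a, b in zip(plaintext, ks))

def decrypt(key: bytes, nonce: bytes, ciphertext: bytes) -> bytes:
    ks = _keystream(key, nonce, len(ciphertext))
    return bytes(a ^ b for a, b in zip(ciphertext, ks))

@dataclass
class LedgerEntry:
    blob_hash: str       # pointer to the encrypted blob
    licence: str         # licence terms attached to this object
    paid: bool = False   # payment state tracked on the thin L2

@dataclass
class PersonalDataVault:
    owner_key: bytes
    blobs: dict = field(default_factory=dict)
    ledger: list = field(default_factory=list)

    def publish(self, plaintext: bytes, licence: str) -> str:
        # Encrypt at rest; only a pointer + licence hits the metadata ledger.
        nonce, ct = encrypt(self.owner_key, plaintext)
        blob_hash = hashlib.sha256(ct).hexdigest()
        self.blobs[blob_hash] = (nonce, ct)
        self.ledger.append(LedgerEntry(blob_hash, licence))
        return blob_hash

    def read(self, blob_hash: str) -> bytes:
        # Only the holder of owner_key can recover the plaintext.
        nonce, ct = self.blobs[blob_hash]
        return decrypt(self.owner_key, nonce, ct)
```

The key property mirrored here is that the ledger carries only the hash pointer and licence terms; the blob itself never leaves the vault unencrypted.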
2 De-identification & anonymisation pipeline
*Scenario 2: How businesses work with data brokers and Facebook to track you.*
Before any object leaves a PDV it travels through a privacy filter executed inside the user's vault:
- Schema pruning – Drop direct identifiers (name, e-mail, IMEI).
- Hash-salt – Deterministically hash quasi-identifiers (e.g., postcode + DOB) with user-specific salt, blocking cross-vault linkage.
- k-Anonymity & ℓ-Diversity – Group rows until ≥ k users share the same quasi-ID vector and at least ℓ distinct sensitive-attribute values exist.
- Differential-privacy noise – Add Laplace noise to numeric columns; clip outliers.
- Audit ZK-proof – Produce a proof that steps 1-4 ran on the item with ε≤1.0. The oracle checks this before minting DATOS.
The raw, re-identifiable version never leaves the vault.
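As a rough illustration of steps 1–4, here is a minimal Python filter. The column names, salt handling, and noise scale are assumptions for the sketch; ℓ-diversity checking and the ZK audit proof of step 5 are omitted.

```python
import hashlib
import math
import random
from collections import Counter

DIRECT_IDS = {"name", "email", "imei"}   # step 1: schema pruning targets
QUASI_IDS = ("postcode", "dob")          # step 2: hash-salt targets

def prune_and_hash(row: dict, salt: bytes) -> dict:
    # Step 1: drop direct identifiers.
    out = {k: v for k, v in row.items() if k not in DIRECT_IDS}
    # Step 2: deterministic salted hash of the quasi-identifier vector;
    # a user-specific salt blocks linkage across vaults.
    quasi = "|".join(str(row[q]) for q in QUASI_IDS)
    out["quasi_hash"] = hashlib.sha256(salt + quasi.encode()).hexdigest()
    for q in QUASI_IDS:
        out.pop(q, None)
    return out

def is_k_anonymous(rows, k: int) -> bool:
    # Step 3: every quasi-ID group must contain at least k rows.
    counts = Counter(r["quasi_hash"] for r in rows)
    return all(c >= k for c in counts.values())

def laplace_noise(scale: float) -> float:
    # Step 4: inverse-CDF sample from Laplace(0, scale).
    u = random.random() - 0.5
    while u <= -0.5:                  # guard against log(0)
        u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1 - 2 * abs(u))

def add_dp_noise(rows, column: str, epsilon: float, sensitivity: float = 1.0):
    scale = sensitivity / epsilon     # standard Laplace-mechanism scale
    for r in rows:
        r[column] += laplace_noise(scale)
    return rows
```

All of this runs inside the vault; only the pruned, hashed, noised rows are candidates for export.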
3 Main-street distribution: 4-step sprint (2–4 weeks)
| Sprint step | Engineering artefact | Market integration |
|---|---|---|
| 1 Normalise & bucketise | Spark/Flink writes Parquet v2 (Snappy, INT96 timestamps) partitioned `dt=YYYY-MM-DD/HH/country`. | Cloud engines can push down predicates and cold-scan cheaply. |
| 2 Stage to object store | AWS S3 → `arn:aws:s3:::<product>-analytics-prod` (Object Lock + SSE-KMS); GCP GCS → `gs://<product>-analytics-prod`. | – |
| 3 Expose query & file modes | – | Snowflake Marketplace: Provider Studio ▸ Create Listing; consumers can `SELECT * FROM <db>.panel` with zero ETL. AWS Data Exchange: Create Data Set ▸ Add Revision ▸ Publish Product. |
| 4 Token-meter wrapper | ~30-line gateway issues a presigned URL only after `BurnMeter.consume()` succeeds on-chain; returns HTTP 302. | – |
Snowflake + ADX cover ≈ 70 % of enterprise data spend; BigQuery Analytics Hub or Databricks Clean-Rooms can be added later.
4 Economic example — vault owners vs. today's broker model
Scenario
- 100 k users stream location pings + card receipts (≈ 500 MB/user/day).
- The oracle's quality score mints ≈ 1 DATOS per 1 MB of accepted data.
- AI lab buys 1 PB of de-identified data (training self-supervised retail-demand model). Market price (from recent private deals) ≈ $1 M/PB.
DATOS flow
| Step | Tokens | Fiat |
|---|---|---|
| Mint | 1 PB × 1 DATOS/MB = 1 B DATOS to users | – |
| Burn on purchase | AI lab burns 1 B DATOS | Pays ≈ $1 M equivalent via a DEX ↔ DATOS swap |
| Redistribution | Treasury pro-rates the new supply → each user gets 10 k DATOS (= $10) | Users see the wallet balance or auto-swap to USDC |
Compare with broker world
- Data broker CPM equivalent ≈ $0.50 per 1 k device-days.
- Same 100 k users → ≈ 100 k device-days per day → broker revenue ≈ $50/day (≈ $1.5 k/month); individuals receive $0.
- Broker market > $270 B (2024) and growing 7 % CAGR.
Decentralised ownership captures roughly 70 % of the gross sale for creators, after DEX swap and infrastructure fees totalling ≈ 30 %.
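The arithmetic behind the scenario can be checked in a few lines (decimal units, 1 PB = 10⁹ MB). Note that at $0.50 per 1 k device-days, 100 k users generate ≈ $50 per day, i.e. ≈ $1.5 k per month.

```python
PB_IN_MB = 1_000_000_000          # 1 PB = 1e9 MB (decimal units)
users = 100_000
sale_price_usd = 1_000_000        # market price for 1 PB of de-identified data

# Mint: 1 DATOS per MB of accepted data.
minted = PB_IN_MB * 1             # 1 B DATOS distributed to users

# Redistribution: treasury pro-rates the supply across contributors.
per_user_tokens = minted // users             # 10,000 DATOS each
usd_per_token = sale_price_usd / minted       # $0.001 per DATOS at this sale
per_user_usd = per_user_tokens * usd_per_token  # ≈ $10 per user

# Broker comparison: $0.50 per 1,000 device-days.
device_days_per_day = users                   # one device-day per user per day
broker_daily = device_days_per_day / 1_000 * 0.50   # ≈ $50/day

# Creator share under the decentralised model (≈ 70 % of gross).
creator_share = 0.70 * sale_price_usd
```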
5 Where big-tech money comes from today
| Actor type | Revenue model | Role of personal data |
|---|---|---|
| Advertising networks (Meta, Google) | Auction CPM/CPC ads; ~ $0.01 per ad impression. | User attributes train look-alike models; buyers can upload hashed CRM lists (see Facebook Custom Audiences). The average volunteer in the CR study was targeted by 2,230 companies; one was targeted by 48,000; 186,892 companies appeared in total. |
| AI providers (OpenAI, Anthropic) | Pay once for training data; monetise via API usage. | Private licensing boom: Photobucket negotiating $0.05–$1/photo; Shutterstock deals $25–50 M each. |
| Cloud platforms (AWS, Azure, GCP) | Charge storage ($0.023/GB-mo), egress, marketplace rev-share (20 %). | Host broker data sets; offer clean-room compute that never leaves VPC. |
| Social media | Ad revenue + paid APIs; upsell "firehose" (e.g., X "enterprise tier" $42 k/mo). | First-party insight; sometimes resell historical archives to AI labs. |
6 Take-away for builders
By merging on-chain licence tokens with cloud-native delivery channels, you meet enterprise buyers where they already shop while guaranteeing that only de-identified, consented data crosses the gate—and that the economic upside flows back to the very people who generated it.