GDPR and Journal Apps: Your Privacy Rights | Conviction

Your journal entries are health data under GDPR. Most apps don't treat them that way. Learn what Art. 9 requires and how to check if your journal app complies.

You wrote about your anxiety on Tuesday. Your fear of failure on Thursday. Your relationship doubts on Sunday. Under European law, every word of that is health data. It falls under the same GDPR protections as your medical records.

Does your GDPR journal app treat it that way?

Most don’t. Most journaling apps treat your entries as regular personal data, the same category as your email address or display name. But GDPR Article 9 draws a hard line between personal data and special category data. Journal entries containing mood tracking, emotional reflections, and mental health content fall on the special category side. That changes everything about how your app should handle them.

This resource explains what GDPR actually requires for journal apps, what rights you have as a user, and how to evaluate whether your GDPR journal app takes your mental health privacy seriously.

Why Journal Entries Are Health Data Under GDPR

What GDPR Considers “Health Data”

Under GDPR Article 9, health data is any data that reveals a person’s physical or mental health status. Journal entries that contain mood ratings, emotional reflections, anxiety descriptions, therapy notes, or mental health observations qualify as special category data under this definition. They require explicit consent and enhanced technical protections before any processing can occur.

This isn’t a technicality. A peer-reviewed study published in the Journal of Law and the Biosciences established that self-reported mental health data, including mood logs and emotional reflections, meets the GDPR definition of health data. The legal analysis from Taylor Wessing, a leading European law firm, confirms the same conclusion for digital health apps: data collected by mental health applications is health data, and explicit consent under Art. 9(2)(a) is required.

Why This Matters for Your Journal App

The distinction between regular personal data and special category health data triggers a cascade of legal requirements.

Explicit consent, not just a privacy policy. Your app needs to ask you specifically for permission to process health data. A general “I agree to the privacy policy” checkbox is not enough under Art. 9. You should see a separate, clearly worded consent request that names health data processing.

Enhanced technical protections. Encryption is not optional for health data. Your app should use recognized standards like AES-256 encryption at rest, not vague promises about “industry-standard security.”

Stricter rules on data transfer. If your app processes entries on cloud servers, those transfers must comply with GDPR data transfer requirements. If AI analyzes your entries on external infrastructure, that counts as processing your health data on third-party servers.

Most journaling apps skip these requirements entirely. They store your mood data, emotional reflections, and therapy notes alongside your email address, treating everything as generic personal data. Under GDPR, that’s a compliance gap.

Your GDPR Journal App Rights: What the Law Guarantees

GDPR gives you specific, enforceable rights over your journal data. These aren’t suggestions. They’re legal requirements your app must honor, generally within one month of your request (Art. 12(3)).

Right to Know (Art. 13/14)

Your journal app must tell you what data it collects, where that data goes, and who processes it. This includes disclosing whether AI models analyze your entries, whether data travels to cloud servers, and whether any third-party SDKs have access to your content.

Right to Access (Art. 15)

You can request a complete copy of all your data in a readable format. Your GDPR journal app must respond within one month, extendable by two further months only for complex or numerous requests (Art. 12(3)).

Right to Data Portability (Art. 20)

You have the right to receive your data in a structured, machine-readable format like JSON or CSV. This means you can export all your journal entries, including metadata, and transfer them to another service. No vendor lock-in. No proprietary formats that trap your writing.

If your GDPR journal app doesn’t offer data export in a standard format, it’s not meeting Art. 20 requirements for data portability.
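
As an illustration of what an Art. 20 compliant export could look like, here is a minimal sketch. The field names and structure are assumptions for the example, not any app’s actual export schema:

```python
import json
from datetime import datetime, timezone

def export_entries(entries):
    """Serialize journal entries to a structured, machine-readable
    JSON document (Art. 20 portability). Field names are illustrative."""
    payload = {
        "format_version": "1.0",
        "exported_at": datetime.now(timezone.utc).isoformat(),
        "entries": [
            {
                "id": e["id"],
                "created_at": e["created_at"],
                "mood": e.get("mood"),       # special category data under Art. 9
                "tags": e.get("tags", []),
                "body": e["body"],
            }
            for e in entries
        ],
    }
    return json.dumps(payload, ensure_ascii=False, indent=2)

entries = [{"id": 1, "created_at": "2025-01-07T21:14:00Z",
            "mood": 3, "tags": ["anxiety"], "body": "Rough day."}]
print(export_entries(entries))
```

The point of a format like this is that any other tool can parse it: no proprietary container, no lock-in.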

Right to Erasure (Art. 17)

You can request deletion of all your data. For a journal app, “complete deletion” means removing your entries from local storage, cloud servers, backups, analytics systems, and any derived data like AI-generated embeddings or summaries.

If your app only deletes entries from the visible interface but keeps copies in cloud backups or analytics databases, that’s not erasure. That’s hiding.
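
A deletion routine that satisfies Art. 17 has to walk every storage layer, not just the visible one. A schematic sketch, with illustrative layer names (the stores here are stand-ins, not a real backend):

```python
def erase_account(stores):
    """stores: dict of layer name -> object with a wipe() method.
    Returns a per-layer report so erasure is auditable, not assumed."""
    report = {}
    for layer, store in stores.items():
        report[layer] = store.wipe()   # e.g. number of records removed
    return report

class FakeStore:
    """Stand-in for a real storage layer."""
    def __init__(self, n): self.n = n
    def wipe(self):
        removed, self.n = self.n, 0
        return removed

stores = {
    "local_db": FakeStore(120),       # visible entries
    "cloud_backup": FakeStore(120),   # server-side copies
    "analytics": FakeStore(35),       # derived events
    "ai_embeddings": FakeStore(120),  # vectors derived from entries
}
report = erase_account(stores)
print(report)
```

Deleting only `local_db` and reporting success is exactly the “hiding” failure mode described above.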

Right to Restrict Processing (Art. 18)

You can restrict processing of your data in specific situations, for example while you contest the accuracy of your data or while a dispute over the lawfulness of processing is resolved. Your data remains stored but cannot be analyzed, processed, or used in any way. Read-only access for your own data portability must still be preserved.

You can withdraw consent at any time, and withdrawal must be as easy as granting consent was. If your app required a single tap to grant consent but demands a support email to withdraw it, that violates Art. 7(3).

The practical test: open your journal app’s settings right now. Can you find a way to withdraw consent for health data processing? If you can’t find it within 30 seconds, your app has a problem.

Your data, your rights. These protections exist because journal entries containing emotional and psychological content are among the most sensitive data you produce. To understand what privacy means beyond GDPR compliance, read our guide to what makes a journal app truly private.

What Article 9 Means for AI in Your GDPR Journal App

When AI Processes Your Health Data

AI analysis of journal entries is processing special category data. Full stop.

When an AI reads your entry about anxiety and generates insights, it’s processing health data. That processing requires explicit consent, specifically for AI analysis, not just general consent for storing entries.

Cloud AI processing adds a layer of complexity. Your entries leave your device and travel to external servers where they’re analyzed by models running on third-party infrastructure. Under GDPR, this creates data transfer obligations, processor agreements, and additional consent requirements.

On-device AI processing avoids these issues entirely. When the AI model runs on your phone, your entries never leave your device. No data transfer occurs. No third-party processor touches your health data. For any GDPR journal app using AI, on-device processing is the simplest path to Art. 9 compliance.

Learn more about on-device AI journaling and why processing location matters for your privacy.

The EU AI Act and Journal Apps

The EU AI Act, whose obligations for high-risk AI systems phase in through 2026, introduces additional requirements for AI systems processing health data. AI that analyzes mental health content could face high-risk classification, which triggers transparency requirements, human oversight obligations, and mandatory risk management.

For journal apps using cloud AI, this means another layer of compliance obligations on top of GDPR Art. 9. For apps using on-device AI, the compliance surface is smaller because your data never enters a cloud AI pipeline.

Data Protection Impact Assessment

GDPR Art. 35 requires a Data Protection Impact Assessment (DPIA) when processing health data at scale. Journal apps with AI analysis should conduct DPIAs that evaluate necessity, proportionality, risks, and safeguards.

A DPIA is not a checkbox exercise. It’s a structured evaluation of whether the processing is justified and whether the safeguards are adequate. If your journal app processes health data with AI and has never conducted a DPIA, it may not be meeting its Art. 35 obligations.

How to Evaluate a GDPR Journal App for Compliance

What Should You Look for in a GDPR Compliant Journal App?

The following checklist covers the minimum requirements for GDPR-compliant handling of your mental health data. Use it to evaluate any journal app before trusting it with your entries.

  1. Explicit health data consent. Does the app ask for separate, specific consent for health data processing? Not just a general privacy policy agreement, but a distinct consent request naming Art. 9 health data.

  2. Named encryption standard. Is your data encrypted at rest? What standard? Look for AES-256 or equivalent. “Industry-standard security” without naming the standard is a red flag.

  3. Clear data location disclosure. Where is your data stored? On your device? On cloud servers? In which country? Your app should tell you this clearly.

  4. AI processing location. If the app uses AI, does it process your entries locally on your device or send them to cloud servers? Cloud AI processing of health data requires additional consent and safeguards.

  5. Data export in standard format. Can you export all your data in JSON, CSV, or another structured, machine-readable format? This is your Art. 20 right.

  6. Complete account deletion. Can you delete everything? Does the app specify exactly what gets deleted, including local data, cloud data, backups, and analytics?

  7. No third-party tracking SDKs. Does the app integrate advertising or analytics SDKs that process your data? A German security audit by Kuketz IT-Security found advertising networks like MoPub and Amazon Advertising integrated into popular diary apps, collecting unique device identifiers from users writing about their most private thoughts.

  8. Privacy policy mentions health data. Does the privacy policy specifically address Art. 9 health data, or does it only cover general data processing?

  9. Processing restriction available. Can you restrict processing of your data under Art. 18?

  10. Easy consent withdrawal. Can you withdraw consent as easily as you granted it? One tap to grant should mean one tap to withdraw.
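
The ten checks above can be encoded as a simple scoring sheet. A minimal sketch, with check names paraphrased from this list (this is an aid for your own evaluation, not an official audit tool):

```python
CHECKS = [
    "explicit_health_data_consent",   # 1. separate Art. 9 consent
    "named_encryption_standard",      # 2. e.g. AES-256, not vague claims
    "data_location_disclosed",        # 3. device vs. cloud, which country
    "ai_processing_location",         # 4. on-device vs. cloud AI
    "standard_format_export",         # 5. JSON/CSV export (Art. 20)
    "complete_deletion",              # 6. local, cloud, backups, analytics
    "no_tracking_sdks",               # 7. no ad/analytics SDKs
    "policy_mentions_health_data",    # 8. Art. 9 named in privacy policy
    "processing_restriction",         # 9. Art. 18 restriction available
    "easy_consent_withdrawal",        # 10. as easy as granting (Art. 7(3))
]

def evaluate(app_answers):
    """app_answers: dict mapping check name -> bool.
    Returns (passed, failed) lists for a quick compliance snapshot."""
    passed = [c for c in CHECKS if app_answers.get(c, False)]
    failed = [c for c in CHECKS if c not in passed]
    return passed, failed

passed, failed = evaluate({"standard_format_export": True})
print(f"{len(passed)}/10 checks passed; missing: {failed[:3]}")
```

Anything the app’s documentation doesn’t let you answer with a clear yes should count as a fail.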

Conviction checks all 10. Explicit Art. 9 consent flow. Named AES-256 encryption. On-device AI, so no journal content leaves your phone. Full JSON export. Complete account deletion with audit report. Review Conviction’s privacy policy or try it free for 30 days.

Red Flags in Journal Apps

Watch for these warning signs when evaluating your GDPR journal app.

  • Privacy policy that never mentions Art. 9 or health data. If your app stores mood ratings and emotional reflections but its privacy policy only discusses “personal data,” it’s treating health data as regular data.

  • “Industry-standard security” without naming the encryption. This phrase means nothing without specifics. AES-256, SQLCipher, end-to-end encryption: these are verifiable claims. “Industry-standard” is not.

  • Cloud AI processing without separate consent. If AI analyzes your entries on cloud servers and the app never asked you specifically about AI processing of your health data, that’s a consent gap.

  • No data export feature. Art. 20 requires it. If you can’t export your journal entries in a machine-readable format, your app is not meeting its data portability obligations.

  • Advertising SDKs in a journaling app. Your entries are health data. Advertising networks have no business anywhere near them. If your app shows ads or integrates ad tracking SDKs, your health data may be accessible to third-party advertisers.

  • Deletion that only removes visible entries. Ask your app provider what deletion covers. Does it include cloud backups? Analytics data? AI-generated embeddings? Incomplete deletion violates Art. 17.

How Conviction Works as a GDPR Journal App

Conviction is a GDPR journal app that treats your entries as what they are under the law: Art. 9 health data requiring the highest level of protection. You can review the specifics in Conviction’s privacy policy. Here is how that principle translates into the product.

Explicit Health Data Consent (Art. 9)

Conviction uses separate consent categories with timestamps for audit trail accountability. Before any journal entries are stored, you grant explicit consent for health data processing. This is distinct from your privacy policy agreement, your cloud sync consent, your biometric authentication consent, and your analytics consent. Each consent is recorded individually with a timestamp, meeting Art. 7 accountability requirements.
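
A per-category consent ledger of the kind described could be sketched like this. This is a hypothetical data model for illustration, not Conviction’s actual implementation:

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional

@dataclass
class ConsentRecord:
    """One Art. 7 accountability record per consent category."""
    category: str                     # e.g. "health_data_processing"
    granted_at: datetime
    withdrawn_at: Optional[datetime] = None

    @property
    def active(self) -> bool:
        return self.withdrawn_at is None

class ConsentLedger:
    def __init__(self):
        self.records: list[ConsentRecord] = []

    def grant(self, category: str) -> ConsentRecord:
        rec = ConsentRecord(category, datetime.now(timezone.utc))
        self.records.append(rec)
        return rec

    def withdraw(self, category: str) -> None:
        # Withdrawal is one call, mirroring one-tap grant (Art. 7(3)).
        for rec in self.records:
            if rec.category == category and rec.active:
                rec.withdrawn_at = datetime.now(timezone.utc)

    def is_granted(self, category: str) -> bool:
        return any(r.category == category and r.active for r in self.records)

ledger = ConsentLedger()
ledger.grant("health_data_processing")
ledger.grant("analytics")
ledger.withdraw("analytics")
print(ledger.is_granted("health_data_processing"))  # True
print(ledger.is_granted("analytics"))               # False
```

Because withdrawal writes a timestamp instead of deleting the record, the ledger preserves the audit trail Art. 7 accountability requires.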

SQLCipher AES-256 Encryption

All journal data is encrypted at rest with SQLCipher using AES-256 encryption. This is not optional. Health data requires encryption, so encryption is mandatory in Conviction. The encryption key is stored in the device’s Secure Enclave, and biometric authentication (Face ID) adds a second layer of access protection.

On-Device AI Processing

All AI analysis (every insight, theme detection, emotion analysis, and memory-based response) runs locally on your device via Apple Intelligence. No journal content ever leaves your phone for AI processing.

This eliminates GDPR data transfer concerns entirely. There are no third-party AI processors, no cross-border data transfers, and no processor agreements needed for your journal content. Conviction uses no cloud storage for journal data. Your entries stay on your device. Verify it yourself by running a network inspector while you use Conviction. You’ll find zero calls carrying journal content.

Data Portability (Art. 20)

Conviction offers full JSON export of all entries with metadata. An encrypted export option (AES-256-GCM) is also available. No vendor lock-in. Export everything you’ve written at any time, in a structured, machine-readable format. Leave if you want.

Right to Erasure (Art. 17)

Account deletion in Conviction is comprehensive. It covers your local SQLite database including all tables, your secure credential storage including encryption keys and biometric data, your app state and preferences, your iCloud backups if enabled, and your authentication session. iCloud backups use a zero-knowledge backup architecture: your data is encrypted with your passphrase before it leaves your device, so neither Apple nor Conviction can read it. A detailed deletion report shows exactly what was removed.
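
The zero-knowledge property hinges on deriving the encryption key from your passphrase on-device, so the key itself never exists server-side. A generic sketch using PBKDF2 with illustrative parameters (this is a standard technique, not Conviction’s actual key derivation scheme):

```python
import hashlib
import os

def derive_backup_key(passphrase: str, salt: bytes) -> bytes:
    # Key is derived locally; only the ciphertext and salt would ever
    # leave the device, so the server cannot reconstruct the key.
    return hashlib.pbkdf2_hmac(
        "sha256",
        passphrase.encode("utf-8"),
        salt,
        iterations=600_000,   # illustrative work factor
        dklen=32,             # 32 bytes = AES-256 key size
    )

salt = os.urandom(16)
key = derive_backup_key("correct horse battery staple", salt)
print(len(key))  # 32
```

The same passphrase and salt always yield the same key, so the device can re-derive it for restore; a wrong passphrase yields a different key and the backup stays unreadable.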

Processing Restriction (Art. 18)

Users can restrict all processing of their health data with a single action. Read-only access is preserved for data portability. Deletion requests are still honored during restriction. Your data stays frozen until you decide what to do next.
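
Conceptually, restriction is a gate that blocks analysis while leaving storage, export, and deletion untouched. A minimal sketch of that state (illustrative only; not Conviction’s internals):

```python
class ProcessingGate:
    """Art. 18 restriction: storage and read access stay, analysis stops."""
    def __init__(self):
        self.restricted = False

    def analyze(self, entry):
        if self.restricted:
            raise PermissionError("processing restricted (Art. 18)")
        return {"themes": []}   # placeholder for real analysis

    def export(self, entries):
        # Portability (Art. 20) still works while restricted.
        return list(entries)

gate = ProcessingGate()
gate.restricted = True
print(gate.export(["entry"]))
try:
    gate.analyze("entry")
except PermissionError as e:
    print(e)
```

The key design point is that restriction is a single flag checked at every processing path, not a separate mode that has to be wired into each feature after the fact.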

Zero Cloud Processing of Journal Content

No third-party SDKs process journal data. No advertising networks. No analytics on journal content, only metadata like session duration and feature usage. RAG-based memory (the vector embeddings that let the AI reference your past entries) is stored locally on your device. Even the consistency system uses Momentum instead of streaks, because data minimization applies to behavioral nudges too.

This is verifiable. Run a network inspector. Inspect the traffic. Conviction sends nothing.

GDPR Rights Comparison Table

| GDPR Right | Article | What It Means for Journal Users | How Conviction Implements It |
| --- | --- | --- | --- |
| Right to Know | Art. 13/14 | Know what data is collected and where it goes | Transparent consent flow, no cloud processing of content |
| Right to Access | Art. 15 | Get a copy of all your data | JSON export with full metadata |
| Right to Portability | Art. 20 | Receive data in standard format, transfer anywhere | JSON and encrypted export, no lock-in |
| Right to Erasure | Art. 17 | Delete all your data completely | Complete deletion across all storage layers with report |
| Right to Restrict | Art. 18 | Pause processing, keep read access | One-action restriction, read-only preserved |
| Right to Withdraw Consent | Art. 7(3) | Withdraw as easily as you granted | Individual consent categories, reversible |

Key Takeaways

Your journal entries are not regular personal data. If they contain mood tracking, emotional reflections, or mental health content, they’re health data under GDPR Art. 9. That classification triggers specific legal requirements that most GDPR journal app providers don’t meet.

Here’s what to remember:

  • Journal entries with mood and emotional content are Art. 9 health data. They require explicit consent and enhanced technical protections.

  • You have the right to export, delete, and restrict your data. These are legal rights, not app features your provider can choose to skip.

  • AI processing of journal entries requires explicit consent. General privacy policy agreements aren’t enough for AI analysis of health data.

  • On-device AI eliminates data transfer concerns. When processing stays local, your health data never enters a cloud pipeline.

  • Use the 10-point checklist. Before trusting any app with your mental health data, verify that it meets the minimum GDPR requirements.

Shadow work entries, CBT reflections including thought records, mood logs, voice journal entries, and emotional processing all qualify as health data. The more vulnerable the content, the more your GDPR rights matter. This is especially true for therapeutic journal content like shadow work journaling, where what you write is often the most private material you’ll ever put into words.

Your entries are health data. Conviction treats them that way. On your device. Encrypted. Yours. Try Conviction free for 30 days. No credit card required.

This article is for informational purposes only and is not a substitute for professional mental health treatment. If you are experiencing significant distress, please consult a licensed therapist or counselor.