Security
Last updated: 16 April 2026
Protecting customer data is a founding commitment. This page describes the security practices currently in place at Akaro AI. We are a small team, and we are transparent about where we are today and where we are heading. If you have a security question or want to report a vulnerability, email [email protected].
Architecture
- Edge. Cloudflare sits in front of akaro.ai, www.akaro.ai, and platform.akaro.ai, providing TLS termination, DDoS mitigation, a web application firewall, and a global CDN. The chat API at api.platform.akaro.ai is served directly from the origin (bypassing Cloudflare) to preserve long-lived server-sent event streams.
- Hosting. The platform runs on a dedicated virtual server at Hostinger in the United States (Boston, Massachusetts). Caddy serves as the origin TLS proxy and routes traffic to the application tier.
- Operational database. MongoDB Atlas, managed by MongoDB, Inc., stores account records, organisation metadata, project and document metadata, chat history, activity logs, and encrypted connector tokens.
- Vector store. ChromaDB runs on the application server and stores embeddings used to retrieve relevant knowledge-base content for AI queries.
- AI processing. OpenAI’s API is used for embeddings and completions. Requests are transmitted over TLS. OpenAI’s API terms state that data submitted via the API is not used to train OpenAI’s models by default.
- Email. Transactional email is delivered via Google Workspace SMTP.
A complete list of third-party services with access to customer data is at /legal/subprocessors.
Encryption
- In transit. All traffic between clients, the platform, and sub-processor APIs is encrypted with TLS 1.2 or higher. TLS certificates are issued by Let’s Encrypt and renewed automatically by Caddy.
- At rest (operational database). MongoDB Atlas applies AES-256 encryption at rest to the operational database by default.
- At rest (vector store and uploaded files). Stored on the encrypted file system of the application server.
- Secrets and connector tokens. OAuth access and refresh tokens for third-party connectors are encrypted at the application layer using Fernet (AES-128-CBC with HMAC-SHA256) before being written to the database.
- Passwords. User passwords are stored only as bcrypt hashes. We never see or store plaintext passwords.
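As a minimal sketch of the token-at-rest pattern described above, using the `cryptography` package's Fernet recipe (the key here is generated per process purely for illustration; in production it would come from the secrets manager, and the database calls are omitted):

```python
from cryptography.fernet import Fernet

# Illustrative only: a real deployment loads this key from a secrets
# manager rather than generating it at startup.
key = Fernet.generate_key()
fernet = Fernet(key)

def seal_token(plaintext_token: str) -> bytes:
    """Encrypt an OAuth token before it is written to the database."""
    return fernet.encrypt(plaintext_token.encode("utf-8"))

def unseal_token(ciphertext: bytes) -> str:
    """Decrypt a stored token when a connector needs to use it."""
    return fernet.decrypt(ciphertext).decode("utf-8")
```

Fernet handles the AES-128-CBC encryption, the HMAC-SHA256 authentication tag, and the IV internally, so a tampered ciphertext fails to decrypt rather than yielding garbage.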
Authentication and access
- End users. Sign in with email and password, or Sign in with Google. Sessions are managed via short-lived JWT access tokens (15 minutes) and rotating refresh tokens (7 days). Two-factor authentication (TOTP) is available and recommended for all admins.
- Role-based access. Within an organisation, users are assigned roles (owner, admin, member, browser) that control what they can see and do.
- Internal access. Akaro personnel access production systems only on a need-to-know basis, through SSH with key-based authentication. Production credentials are stored in a secrets manager and rotated when personnel change.
- Admin activity. Privileged administrative actions (impersonation, password reset on behalf of a user, resending verification) are logged.
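To illustrate the short-lived access-token model (15-minute expiry), here is a hand-rolled HS256 JWT sketch using only the standard library. This is a teaching sketch, not our implementation: the signing key is a placeholder, and a production service would use a maintained JWT library.

```python
import base64
import hashlib
import hmac
import json
import time

SECRET = b"placeholder-signing-key"  # illustrative only

def _b64url(data: bytes) -> str:
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def issue_access_token(user_id: str, ttl_seconds: int = 15 * 60) -> str:
    """Mint a short-lived HS256 JWT for the given user."""
    header = _b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    payload = _b64url(json.dumps(
        {"sub": user_id, "exp": int(time.time()) + ttl_seconds}).encode())
    signing_input = f"{header}.{payload}".encode()
    sig = _b64url(hmac.new(SECRET, signing_input, hashlib.sha256).digest())
    return f"{header}.{payload}.{sig}"

def verify_access_token(token: str) -> dict:
    """Check signature and expiry; raise ValueError on failure."""
    header, payload, sig = token.split(".")
    signing_input = f"{header}.{payload}".encode()
    expected = _b64url(hmac.new(SECRET, signing_input, hashlib.sha256).digest())
    if not hmac.compare_digest(sig, expected):
        raise ValueError("bad signature")
    claims = json.loads(base64.urlsafe_b64decode(payload + "=" * (-len(payload) % 4)))
    if claims["exp"] < time.time():
        raise ValueError("token expired")
    return claims
```

Because access tokens expire quickly, a stolen token has a narrow window of use; the rotating 7-day refresh token is what actually extends the session.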
Application security
- Rate-limiting on authentication and chat endpoints to mitigate abuse.
- Input validation via Pydantic schemas on the backend.
- Hardened content rendering in the web client; `dangerouslySetInnerHTML` is avoided except for trusted content.
- Dependencies are tracked through the package manifests of both repositories; security advisories are reviewed through GitHub’s Dependabot.
- Source code is hosted on GitHub in private repositories, with branch protection on the `main` branch.
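The rate-limiting bullet above can be illustrated with a minimal in-process token bucket. This is a conceptual sketch only; the quotas, storage, and enforcement point of the production limiter are not described here.

```python
import time

class TokenBucket:
    """Allow bursts up to `capacity`, refilled at `rate` tokens/second."""

    def __init__(self, capacity: float, rate: float):
        self.capacity = capacity
        self.rate = rate
        self.tokens = capacity
        self.updated = time.monotonic()

    def allow(self) -> bool:
        now = time.monotonic()
        # Refill in proportion to the time elapsed since the last check.
        elapsed = now - self.updated
        self.tokens = min(self.capacity, self.tokens + elapsed * self.rate)
        self.updated = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False

# Hypothetical quota: 5 login attempts, refilling one every 12 seconds.
bucket = TokenBucket(capacity=5, rate=1 / 12)
results = [bucket.allow() for _ in range(6)]
# The first five attempts pass; the sixth is throttled.
```

Keyed per client identifier (IP, account, or session), the same structure throttles both authentication and chat endpoints.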
Resilience
- Database backups. MongoDB Atlas provides automated continuous backups with point-in-time restore. Retention follows the Atlas plan in use.
- Uptime. The platform is a single-region deployment today. Service availability is monitored through uptime checks against the `/health` endpoint.
- Disaster recovery. We are building out a documented disaster-recovery runbook and regular restore drills; for now, the combination of managed database backups and the reproducible application image is our baseline.
Privacy by design
- Customer content is scoped to a single organisation. Search and retrieval are always filtered to the requesting organisation.
- Deletion of a document in the product propagates to the associated vector embeddings.
- Connectors import only the content the user authorises; disconnecting a connector stops further imports.
- Customer content is not used to train shared models. See our Privacy Policy and DPA.
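The organisation-scoping rule above can be sketched as a mandatory filter applied before any relevance logic runs. This is a plain-Python illustration of the invariant, not the actual vector-store query code; the field names and data shapes are assumed.

```python
from typing import Callable

def scoped_search(index: list[dict], org_id: str,
                  is_relevant: Callable[[dict], bool]) -> list[dict]:
    """Every retrieval path filters by the requesting organisation first,
    so the relevance check only ever sees that organisation's records."""
    return [doc for doc in index
            if doc["org_id"] == org_id and is_relevant(doc)]

index = [
    {"org_id": "org-a", "text": "pricing sheet"},
    {"org_id": "org-b", "text": "pricing policy"},
]
hits = scoped_search(index, "org-a", lambda d: "pricing" in d["text"])
# Only org-a's document is returned, even though both match on content.
```

In the real system the same invariant is expressed as a metadata filter on the vector query, so cross-organisation content can never enter a prompt.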
Chrome extension and Google API use
Our Chrome browser extension uses Google Workspace APIs (Sheets, Docs, Slides, and — on user action — Gmail) to assist with RFPs, questionnaires, drafts, and replies. We adhere to the Google API Services User Data Policy, including the Limited Use requirements: data accessed through Google APIs is used only to provide the user-facing feature the user invoked, is not transferred to others except as necessary to provide that feature, is not used for advertising, and is not read by humans except with the user’s explicit consent or as required by law.
Compliance roadmap
We are committed to being transparent about compliance. Our current status:
- GDPR / UK GDPR: addressed by our Privacy Policy and DPA, with Standard Contractual Clauses incorporated for international transfers.
- India DPDP Act 2023: addressed by our Privacy Policy; a Data Protection Officer is designated.
- CASA (Google Cloud Application Security Assessment): in progress — required for the Chrome extension’s Google Workspace restricted scopes.
- SOC 2 Type II: not yet achieved. We are preparing policies and evidence collection with a view to an initial report in the next 9–12 months.
- ISO 27001: not currently pursued.
- Independent penetration testing: not yet performed. Planned before we process restricted categories of customer data or before a SOC 2 audit.
- HIPAA: the Services are not designed for the processing of protected health information. Do not upload PHI.
We will update this page as each item advances.
Responsible disclosure
If you discover a security vulnerability, please report it to [email protected] with “Security Disclosure” in the subject. We will acknowledge receipt within 2 business days and work to validate and remediate valid issues promptly. Please do not publicly disclose the vulnerability until we have had a reasonable opportunity to address it. We do not currently operate a paid bug-bounty programme but are happy to publicly credit researchers who follow responsible disclosure.
Machine-readable disclosure details are published at /.well-known/security.txt in line with RFC 9116.
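A minimal RFC 9116 file of this kind looks like the following sketch. Only the contact address comes from this page; the `Expires` timestamp and `Canonical` URL are placeholders.

```
Contact: mailto:[email protected]
Expires: 2027-04-16T00:00:00Z
Canonical: https://akaro.ai/.well-known/security.txt
Preferred-Languages: en
```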
Contact
Security: [email protected]
Data Protection Officer: [email protected]
Privacy questions: [email protected]