Cursor Rules Template: MQTT Mosquitto IoT Data Ingestion

Cursor Rules Template for MQTT Mosquitto IoT data ingestion with a ready-to-use .cursorrules block and stack-specific guidance for a secure, testable ingestion pipeline.

Tags: .cursorrules template, MQTT Mosquitto, Cursor rules template, Cursor AI, IoT ingestion, sensor data, TimescaleDB, Postgres ingestion, data pipeline, IoT security

Target User

Developers building MQTT Mosquitto IoT data ingestion pipelines

Use Cases

  • Ingest MQTT messages from Mosquitto into a time-series store
  • Validate and transform sensor payloads before storage
  • Enforce TLS mutual authentication and topic-level access controls
  • Audit data lineage from broker to database for compliance

Overview

This Cursor rules configuration provides a complete, copyable .cursorrules block that steers Cursor AI toward building a secure, production-ready MQTT Mosquitto IoT ingestion pipeline. The template covers Mosquitto as the MQTT broker, a Python (or Node.js) consumer, and a TimescaleDB/PostgreSQL sink, together with architecture and security guidelines, testing workflows, and deployment considerations tailored to IoT workloads.

In short: paste the ready-to-use .cursorrules block into your project root and Cursor AI will generate consistent ingestion logic for MQTT Mosquitto IoT data, with explicit constraints and anti-patterns for safe AI-assisted development.

When to Use These Cursor Rules

  • When you need a reproducible ingestion pipeline from Mosquitto topics to a time-series database.
  • When payload schemas vary across sensors and require validation and normalization.
  • When you require guardrails for AI-assisted code generation, especially around security and data correctness.
  • When you want a stack-specific starting point that you can paste into your project root and customize safely.

Copyable .cursorrules Configuration


Framework Role & Context
You are a Cursor AI rules author for an MQTT Mosquitto IoT ingestion stack. Provide concrete, testable guidance for building a secure, observable ingestion path.

Code Style and Style Guides
Python 3.11 preferred; Black for formatting; type hints; docstrings; lint with flake8. Avoid ambiguous language; prefer explicit transformations.

Architecture & Directory Rules
Root: mqtt-mosquitto-iot-ingestion
Src: src
Ingest: src/ingestor
Config: config
Data: data
Tests: tests

Authentication & Security Rules
TLS mutual authentication on broker and clients; use env vars and secret manager for credentials; never hardcode secrets; store certs under config/certs

Database and ORM patterns
Sink: TimescaleDB (PostgreSQL with hypertables) or plain PostgreSQL; use async DB access where possible; validate payloads before insert; store topic, ts, and payload JSON columns

Testing & Linting Workflows
pytest and pytest-asyncio for tests; pre-commit with lint checks; CI for unit and integration tests; run ingestion locally with a test broker

Prohibited Actions and Anti-patterns for the AI
Do not bypass TLS; do not embed credentials in code; do not publish from ingestion scripts; avoid local file writes outside approved data paths; do not skip payload validation

Recommended Project Structure

mqtt-mosquitto-iot-ingestion/
├── src/
│   └── ingestor/
│       ├── __init__.py
│       ├── mqtt_consumer.py
│       └── processor.py
├── tests/
│   ├── conftest.py
│   └── test_ingestion.py
├── config/
│   ├── mosquitto.conf
│   └── certs/
│       ├── broker.crt
│       └── broker.key
├── data/
│   └── schema.sql
├── Dockerfile
└── docker-compose.yml
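To make the layout concrete, a minimal src/ingestor/mqtt_consumer.py could look like the sketch below. It assumes the paho-mqtt client library (1.x-style API; paho-mqtt 2.x also requires a CallbackAPIVersion argument to Client), reads the broker address from environment variables, and loads mutual-TLS certificates from config/certs. The cert paths, env var names, and parse_device_id helper are illustrative assumptions, not the template's canonical code.

```python
# Hypothetical src/ingestor/mqtt_consumer.py -- a sketch, not canonical code.
# Cert paths, env var names, and the topic layout are assumptions.
import os

TOPIC_PATTERN = "sensors/+/ingest"  # MQTT single-level wildcard for device_id

def parse_device_id(topic: str) -> str:
    """Extract device_id from a topic shaped like sensors/{device_id}/ingest."""
    parts = topic.split("/")
    if len(parts) != 3 or parts[0] != "sensors" or parts[2] != "ingest":
        raise ValueError(f"unexpected topic: {topic}")
    return parts[1]

def main() -> None:
    # Imported here so the pure helpers above stay importable without paho-mqtt.
    import paho.mqtt.client as mqtt

    client = mqtt.Client()
    # Mutual TLS: verify the broker against our CA and present a client cert.
    client.tls_set(
        ca_certs="config/certs/ca.crt",
        certfile="config/certs/client.crt",
        keyfile="config/certs/client.key",
    )

    def on_message(cli, userdata, msg):
        device_id = parse_device_id(msg.topic)
        # Hand off to the processor module; never write outside approved paths.
        print(f"{device_id}: {len(msg.payload)} bytes")

    client.on_message = on_message
    client.connect(os.environ["MQTT_HOST"], int(os.environ.get("MQTT_PORT", "8883")))
    client.subscribe(TOPIC_PATTERN, qos=1)  # QoS 1 per the construction rules
    client.loop_forever()

if __name__ == "__main__":
    main()
```

Keeping the topic-parsing helper free of broker dependencies is what makes it unit-testable without a running Mosquitto instance.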

Core Engineering Principles

  • Security by default: TLS, credentials in env, no hardcoded secrets
  • Explicit, testable behavior: deterministic payload validation and idempotent writes
  • Observability: structured logging, tracing, and metrics for ingestion throughput
  • Programming discipline: clear interfaces between broker, processor, and sink
  • Data correctness: schema validation, timestamp handling, and time-series modeling

Code Construction Rules

  • Topic naming: use a consistent, versioned pattern, e.g., sensors/{device_id}/ingest
  • Message QoS: prefer QoS 1 for reliability; handle redelivery gracefully
  • Payload handling: validate against a JSON schema; drop invalid payloads with audit logging
  • Transformation: normalize units and timestamps in the processor module
  • Dependency isolation: keep broker logic separate from DB logic
  • Logging: include device_id, topic, ts, and payload hash for traceability
  • Do not hardcode credentials; use environment variables or a secrets manager
  • Do not bypass encryption or skip certificate verification in production
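The payload-handling and transformation rules above can be sketched as a small stdlib-only validator. The field names (device_id, ts, value) and the naive-timestamps-mean-UTC policy are assumptions; a real project would likely validate against a versioned JSON Schema instead.

```python
# Stdlib-only payload validator sketch; field names are assumptions.
import json
from datetime import datetime, timezone

REQUIRED_FIELDS = {"device_id", "ts", "value"}

def validate_payload(raw: bytes) -> dict:
    """Parse one sensor message; raise ValueError so callers can audit-log drops."""
    try:
        payload = json.loads(raw)
    except json.JSONDecodeError as exc:
        raise ValueError(f"not valid JSON: {exc}") from exc
    if not isinstance(payload, dict):
        raise ValueError("payload must be a JSON object")
    missing = REQUIRED_FIELDS - payload.keys()
    if missing:
        raise ValueError(f"missing fields: {sorted(missing)}")
    value = payload["value"]
    if not isinstance(value, (int, float)) or isinstance(value, bool):
        raise ValueError("value must be numeric")
    # Normalize timestamps: accept epoch seconds or ISO 8601; store aware UTC.
    ts = payload["ts"]
    if isinstance(ts, (int, float)):
        dt = datetime.fromtimestamp(ts, tz=timezone.utc)
    else:
        dt = datetime.fromisoformat(ts)
        if dt.tzinfo is None:
            dt = dt.replace(tzinfo=timezone.utc)  # assumption: naive ts is UTC
        else:
            dt = dt.astimezone(timezone.utc)
    payload["ts"] = dt
    return payload
```

Raising ValueError (rather than silently dropping) gives the caller one place to attach the audit logging the rules require.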

Security and Production Rules

  • Enable TLS on Mosquitto with client certificates; rotate certificates periodically
  • Isolate ingestion services in a private network; limit broker access to local VPC
  • Validate all payloads; reject unknown topics and oversized messages
  • Implement data retention policies and backpressure handling to avoid data loss
  • Keep audit trails for data lineage and access events
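The "reject unknown topics and oversized messages" rule could be enforced by a small ingress gate like the sketch below; the 64 KiB cap and the fixed sensors/{device_id}/ingest layout are assumptions to tune per deployment.

```python
# Illustrative ingress gate; the size cap and topic layout are assumptions.
MAX_PAYLOAD_BYTES = 64 * 1024

def accept_message(topic: str, payload: bytes) -> bool:
    """Drop oversized messages and anything off the known topic layout."""
    if len(payload) > MAX_PAYLOAD_BYTES:
        return False
    parts = topic.split("/")
    # Only sensors/{device_id}/ingest with a non-empty device_id passes.
    return len(parts) == 3 and parts[0] == "sensors" and parts[2] == "ingest" and bool(parts[1])
```

Rejected messages should still be counted and audit-logged so the gate itself stays observable.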

Testing Checklist

  • Unit tests for payload validators and transformers
  • Integration tests with a test Mosquitto broker and a test TimescaleDB instance
  • End-to-end tests for a complete ingestion cycle from topic to DB insert
  • Static analysis and linting in CI
  • Security tests including TLS handshake and certificate rotation checks
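A unit test for the validator layer might start like this sketch of tests/test_ingestion.py. The validate_payload stand-in is inlined so the example is self-contained; in the real project it would be imported from src/ingestor, and pytest discovers the test_ functions automatically.

```python
# Sketch of tests/test_ingestion.py; the inlined validator is a stand-in.
import json

def validate_payload(raw: bytes) -> dict:
    """Inlined stand-in for the project's real validator (assumption)."""
    payload = json.loads(raw)
    if not isinstance(payload, dict) or not {"device_id", "ts", "value"} <= payload.keys():
        raise ValueError("missing required fields")
    return payload

def test_valid_payload_passes():
    msg = b'{"device_id": "d1", "ts": 1700000000, "value": 3.2}'
    assert validate_payload(msg)["device_id"] == "d1"

def test_missing_field_is_rejected():
    try:
        validate_payload(b'{"device_id": "d1"}')
    except ValueError:
        return  # expected; pytest.raises would be the idiomatic form
    raise AssertionError("invalid payload was accepted")
```

Integration tests would layer a throwaway Mosquitto broker and database on top of these pure unit checks.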

Common Mistakes to Avoid

  • Ignoring payload validation and assuming all sensor messages are well-formed
  • Storing credentials in code or logs
  • Disabling TLS or certificate checks in development and letting those settings leak into production
  • Skipping tests for message deserialization and DB writes
  • Overloading topics or using non-durable QoS without handling retries
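The QoS 1 redelivery pitfall above is usually handled by making writes idempotent. One sketch, shown with an in-memory set for brevity: deduplicate on a (topic, payload hash) key before insert. In production the equivalent guard would more likely be a database unique constraint or a TTL cache (assumption).

```python
# Deduplication sketch for QoS 1 redelivery; in-memory set is for illustration.
import hashlib

_seen: set[tuple[str, str]] = set()

def dedupe_key(topic: str, payload: bytes) -> tuple[str, str]:
    """Key redeliveries by topic plus a SHA-256 payload hash."""
    return (topic, hashlib.sha256(payload).hexdigest())

def should_insert(topic: str, payload: bytes) -> bool:
    """Return True only the first time a (topic, payload) pair is seen."""
    key = dedupe_key(topic, payload)
    if key in _seen:
        return False
    _seen.add(key)
    return True
```

The payload hash doubles as the traceability field the logging rules call for.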

FAQ

What is the MQTT Mosquitto Cursor Rules Template for Cursor AI?

It is a copyable Cursor rules configuration that guides Cursor AI to ingest MQTT Mosquitto messages into a time-series database, with explicit security, testing, and production guidelines for IoT workloads.

How do I paste this into my project root?

Copy the .cursorrules block exactly as shown and place it in your project root, e.g., mqtt-mosquitto-iot-ingestion/.cursorrules, then let Cursor AI validate and tailor it to your environment.

What stack components does this template cover?

The template targets the Mosquitto MQTT broker, a Python (or Node.js) consumer, and a TimescaleDB/PostgreSQL sink, with TLS security and structured data validation.

What are key security considerations?

Use TLS for broker communication, avoid embedding credentials, store secrets in environment or secret managers, rotate certificates, and enforce topic-level access controls.

Can I adapt this to other MQTT brokers?

Yes. The rules template is stack-oriented; you can adapt topic patterns, broker connection logic, and DB integration to other brokers by adjusting the framework context and directory rules.