Fabric RTI 101: Designing Eventstream Pipelines
Designing an Eventstream pipeline in Fabric generally follows a clear, three-step flow. The first step is the inputs, i.e. the data sources. These might include Kafka, Azure Event Hubs, Azure IoT Hub, or other streaming systems. At this stage, you define the connections and schemas so Fabric knows how to interpret incoming events.
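To make the "define the schema" step concrete, here is a minimal sketch in plain Python (not the Fabric API) of declaring the expected shape of incoming events and checking each raw event against it. The field names and types are illustrative assumptions, not part of any real Eventstream source.

```python
import json

# Illustrative schema: field name -> expected type (an assumption, not Fabric syntax).
SCHEMA = {"device_id": int, "reading": float, "unit": str}

def parse_event(raw: str) -> dict:
    """Parse one incoming JSON event and coerce it to the declared schema."""
    event = json.loads(raw)
    for field, expected_type in SCHEMA.items():
        if field not in event:
            raise ValueError(f"missing field: {field}")
        event[field] = expected_type(event[field])  # coerce to the declared type
    return event

print(parse_event('{"device_id": 7, "reading": "21.5", "unit": "C"}'))
# {'device_id': 7, 'reading': 21.5, 'unit': 'C'}
```

The point is the contract: once the schema is declared up front, every downstream transformation can rely on field names and types being consistent.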
The second step is where you apply transformations: the operations that make raw data more usable and more valuable. Filtering reduces noise by dropping irrelevant events. Mapping renames fields, adjusts types, or flattens nested JSON into a cleaner shape. Routing branches different types of events to different destinations. Together, these transformations ensure that events are shaped, cleaned, and directed properly before they move downstream. The third step is the outputs, i.e. the destinations where processed events land, such as an Eventhouse (KQL database), a Lakehouse, or a derived stream, ready for storage, analysis, or further action.
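The three transformation types above can be sketched in plain Python. This is not the Fabric Eventstream API; it is a toy pipeline over in-memory events, and the event shape, threshold, and route names are illustrative assumptions.

```python
import json

def transform(raw_events):
    """Apply filter -> map -> route to a batch of raw JSON events."""
    routes = {"alerts": [], "metrics": []}  # two hypothetical destinations
    for raw in raw_events:
        event = json.loads(raw)

        # Filtering: drop irrelevant events (here, anything without a reading).
        if event.get("reading") is None:
            continue

        # Mapping: rename fields, adjust types, flatten nested JSON.
        shaped = {
            "device_id": str(event["device"]["id"]),  # flatten device.id
            "temp_c": float(event["reading"]),        # coerce string -> float
        }

        # Routing: branch events to different destinations by content.
        destination = "alerts" if shaped["temp_c"] > 80.0 else "metrics"
        routes[destination].append(shaped)
    return routes

raw_events = [
    '{"device": {"id": 7}, "reading": "85.5"}',
    '{"device": {"id": 8}, "reading": "21.0"}',
    '{"device": {"id": 9}}',  # no reading: dropped by the filter
]
result = transform(raw_events)
print(result["alerts"])   # [{'device_id': '7', 'temp_c': 85.5}]
print(result["metrics"])  # [{'device_id': '8', 'temp_c': 21.0}]
```

The order mirrors the design advice in the text: filter first so later stages do less work, shape the event next, and route last once the final form is known.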
2026-05-05

