Data Platforms • Backend • Cloud

Elena Manole

Big Data Engineer | Platform Engineer

I design and run reliable data platforms for analytics and product use cases. Most of my work sits at the intersection of backend services, streaming pipelines, and cloud infrastructure.

About

Big Data and Streaming Engineer with 5+ years of experience building large-scale data systems. I focus on robust ingestion, incremental processing, and production-grade orchestration with tools such as Spark, Kafka, Airflow, and AWS.

Core Strengths

Data Engineering

Streaming and batch pipelines, incremental models, data quality, and lineage-aware processing.

Backend Systems

Python/Go services, APIs, event-driven patterns, fault-tolerant jobs, and production observability.

Cloud & Platform

AWS-native architectures with orchestration, CI/CD integration, and infrastructure-minded system design.

Experience Highlights

Big Data Engineer · SearchPilot (via Remote.com)

Aug 2023 — Present
  • Own core platform pipelines for analytics and experimentation.
  • Built Firehose ingestion and Googlebot/customer-event streams.
  • Tuned streaming and batch workflows for production stability.

Senior Data Engineer · Audience Data, BBC

Feb 2022 — Aug 2023
  • Built and operated large-scale audience data pipelines.
  • Processed data with Apache Hudi and event-time windowing.
  • Automated interaction-attribution models across all sites.

Software Engineer · BBC

Nov 2020 — Feb 2022
  • Implemented CI/CD and deployment security improvements.
  • Built event processing and internal tooling for data products.

Selected Side Projects

MT5 Multi-Timeframe Trading Signal Bot

Python · Data signals · Risk controls

Backend-oriented trading signal engine that consumes OHLC market data across multiple timeframes and outputs explainable BUY/SELL/NO_TRADE decisions with stop-loss/take-profit (SL/TP) levels.

  • Combined ADX, RSI, MACD, VWAP, and Fibonacci logic in a confidence-scored strategy.
  • Added a periodic exporter for live notifications and reporting.
  • Improved signal stability with closed-bar entry logic and robust ATR-based stop-loss design.
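The confidence-scored combination described above can be sketched roughly as follows. This is a minimal, illustrative sketch, not the project's actual code: the function name, the vote/weight representation, and the 0.6 threshold are all assumptions made for the example.

```python
def confidence_signal(votes: dict[str, float], weights: dict[str, float],
                      threshold: float = 0.6) -> str:
    """Combine per-indicator votes in [-1, 1] (e.g. RSI, MACD, ADX)
    into a BUY/SELL/NO_TRADE decision via a weighted confidence score.
    Names and threshold are illustrative, not from the project."""
    score = sum(weights[name] * vote for name, vote in votes.items())
    total = sum(weights[name] for name in votes)
    confidence = score / total  # normalised back into [-1, 1]
    if confidence >= threshold:
        return "BUY"
    if confidence <= -threshold:
        return "SELL"
    return "NO_TRADE"

# Example: RSI and MACD agree strongly bullish, ADX is neutral.
votes = {"rsi": 1.0, "macd": 0.8, "adx": 0.0}
weights = {"rsi": 1.0, "macd": 1.0, "adx": 0.5}
print(confidence_signal(votes, weights))  # BUY (confidence 0.72)
```

The key design point is that no single indicator can trigger a trade on its own; the NO_TRADE band between the two thresholds is what keeps the signal stable on choppy bars.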

Public Auctions Platform

Backend APIs · Data workflows · Reliability

End-to-end auction platform side project focused on backend architecture and data lifecycle: listings, bids, event handling, and reporting.

  • Designed data models and API flows for auction entities and bid state transitions.
  • Emphasized consistent validation, traceability, and predictable outcomes under concurrent updates.
  • Added analytics-ready exports to support operational reporting.
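The "predictable outcomes under concurrent updates" goal above can be illustrated with a version-check (optimistic-concurrency) sketch: a bid is accepted only if the auction is unchanged since the bidder read it and the amount strictly beats the standing bid. All names here are hypothetical, not taken from the project.

```python
from dataclasses import dataclass

@dataclass
class Auction:
    auction_id: str
    high_bid: float
    version: int  # bumped on every accepted bid

def place_bid(auction: Auction, amount: float, seen_version: int) -> bool:
    """Accept the bid only if no concurrent update happened since the
    caller read the auction, and the amount beats the current high bid."""
    if seen_version != auction.version:
        return False  # stale read; caller must re-fetch and retry
    if amount <= auction.high_bid:
        return False  # does not beat the standing bid
    auction.high_bid = amount
    auction.version += 1
    return True

a = Auction("a1", high_bid=100.0, version=0)
print(place_bid(a, 120.0, seen_version=0))  # True: accepted
print(place_bid(a, 110.0, seen_version=0))  # False: version moved on
```

In a real database this check-and-bump would be a single conditional UPDATE (or equivalent transaction) so two bidders can never both win the same version.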

Tools & Technologies

Languages

Python · Go · Scala · Java · SQL · Bash

Data & Streaming

Apache Spark · Kafka · Apache Hudi · Apache Beam · Airflow / MWAA

AWS Data Platform

EMR · Glue · Lambda · Athena · Redshift · S3

Datastores & Delivery

PostgreSQL · MongoDB · Docker · Jenkins

My Medallion Architecture

The Medallion architecture pattern I use to turn raw events into trusted, analytics-ready datasets and production-facing outputs.

Bronze

Raw events from APIs, trackers, and logs.

Silver

Validation, enrichment, dedupe, and session logic.

Gold

Analytics-ready models powering decisions.

Serving

Dashboards, APIs, and product features.

01 Capture reality

Reliable ingestion first. No assumptions, no dropped edge cases.

02 Make data trustworthy

Schema checks, late-arrival handling, and deterministic transforms.

03 Model for action

Domain-ready datasets for experiments, forecasting, and KPIs.

04 Deliver to users

Low-latency access paths that teams can depend on every day.
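The Bronze-to-Silver step (schema checks, dedupe, deterministic transforms) can be sketched in a framework-agnostic way. In practice this would run on Spark or Beam over real tables; the field names and quarantine behaviour below are illustrative assumptions, not the production code.

```python
# Illustrative Bronze → Silver step: validate raw events against a
# required schema, dedupe on the natural key, and keep the transform
# deterministic so re-runs produce byte-identical output.
REQUIRED = {"event_id", "user_id", "ts"}

def bronze_to_silver(raw_events: list[dict]) -> list[dict]:
    seen: set[str] = set()
    silver = []
    for ev in raw_events:
        if not REQUIRED <= ev.keys():  # schema check
            continue                   # in practice: route to quarantine
        if ev["event_id"] in seen:     # dedupe on the natural key
            continue
        seen.add(ev["event_id"])
        silver.append(ev)
    # deterministic ordering: same input always yields the same output
    return sorted(silver, key=lambda e: (e["ts"], e["event_id"]))

raw = [
    {"event_id": "e1", "user_id": "u1", "ts": 2},
    {"event_id": "e1", "user_id": "u1", "ts": 2},  # duplicate
    {"event_id": "e2", "ts": 1},                   # missing user_id
    {"event_id": "e3", "user_id": "u2", "ts": 1},
]
print([e["event_id"] for e in bronze_to_silver(raw)])  # ['e3', 'e1']
```

Gold models then build only on this validated Silver layer, which is what makes the downstream KPIs and experiment datasets trustworthy.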

Let’s Build Reliable Data Systems

I am open to data platform, backend, and cloud engineering roles.

Contact Me