Tech Deployments

Senior AI Systems Engineer & Technical Project Lead specializing in production-grade AI and data infrastructure – from ingestion and orchestration to analytics, deployment, and automation at scale.


🧩 Data Infrastructure

  1. Built End-to-End Data Orchestration System – modular scheduler overseeing 100+ ETL and ML workflows with CI-style logging, monitoring, and failure recovery (see the retry sketch after this list).
  2. Engineered Textflow ETL & Memory Architecture – JSONL/Parquet pipelines feeding FAISS-based retrieval layers (see the FAISS sketch after this list); includes data validation, lineage tracking, and summarization.
  3. Deployed ChromaDB Knowledge Store – persistent embedding system with automated retraining, query APIs, and metadata versioning.
  4. Integrated Multimodal Data Stack – unified ingestion of audio, video, OCR, and structured text into reproducible, schema-validated datasets.
  5. Automated Financial Data Framework – OCR and parsing engine reconciling invoices and ledgers with full audit trails and analytic dashboards.
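
A condensed sketch of the retry-and-recovery pattern the scheduler is built around; the task body, workflow name, and backoff values are hypothetical placeholders, not the production system.

```python
import logging
import time
from typing import Callable

logging.basicConfig(level=logging.INFO, format="%(asctime)s %(levelname)s %(message)s")
log = logging.getLogger("orchestrator")

def run_with_recovery(task: Callable[[], None], name: str,
                      retries: int = 3, backoff_s: float = 2.0) -> bool:
    """Run one workflow, logging every attempt and backing off before each retry."""
    for attempt in range(1, retries + 1):
        try:
            task()
            log.info("workflow=%s attempt=%d status=ok", name, attempt)
            return True
        except Exception:
            log.exception("workflow=%s attempt=%d status=failed", name, attempt)
            time.sleep(backoff_s * attempt)  # linear backoff between retries
    log.error("workflow=%s gave up after %d attempts", name, retries)
    return False

# Hypothetical usage; the production registry holds 100+ ETL/ML workflows.
run_with_recovery(lambda: print("extract -> transform -> load"), name="demo_etl")
```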
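
And a toy version of the retrieval layer: JSONL records indexed in FAISS. The `embed` stub replaces the real text encoder, so treat this as a shape-compatible sketch rather than the actual pipeline.

```python
import json
import numpy as np
import faiss  # pip install faiss-cpu

DIM = 64  # toy width; real embeddings come from a text encoder

def embed(text: str) -> np.ndarray:
    """Stub encoder: a seeded random vector per text, standing in for a real model."""
    rng = np.random.default_rng(abs(hash(text)) % (2**32))
    return rng.standard_normal(DIM, dtype=np.float32)

# Stand-in for records streamed from a JSONL file
records = [json.loads(line) for line in (
    '{"id": 1, "text": "quarterly revenue report"}',
    '{"id": 2, "text": "ml training pipeline notes"}',
    '{"id": 3, "text": "invoice reconciliation log"}',
)]

index = faiss.IndexFlatL2(DIM)  # exact L2 nearest-neighbour search
index.add(np.stack([embed(r["text"]) for r in records]))

_, ids = index.search(embed("pipeline").reshape(1, -1), 2)
print([records[i]["id"] for i in ids[0]])  # ids of the two nearest records
```
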
Deployment: M.I. Journal
Quartz knowledge hub with tags, monthly logs, and semantic navigation.

🧠 Machine-Learning & Knowledge Systems

  1. Designed Entity & Semantic Graph Engine – NER→knowledge-graph pipelines powering cross-document analytics and search (see the graph sketch after this list).
  2. Implemented Repository Intelligence Toolkit – CI scripts and GitHub-API analytics for dependency audits, code-health metrics, and workflow telemetry (see the API sketch after this list).
  3. Built Cloud-Linked Analytics Environments – integrated BigQuery and GCP storage for socio-economic modeling; automated data pulls and transformations.
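
A compressed sketch of the NER-to-graph step, assuming spaCy's `en_core_web_sm` model and simple co-occurrence edges; the production engine's entity schema and edge semantics are richer.

```python
import itertools
import networkx as nx
import spacy  # python -m spacy download en_core_web_sm

nlp = spacy.load("en_core_web_sm")
graph = nx.Graph()

docs = [
    "Acme Corp hired Jane Doe in Buenos Aires.",
    "Jane Doe presented the Acme Corp roadmap.",
]

for text in docs:
    ents = sorted({(e.text, e.label_) for e in nlp(text).ents})
    for (a, la), (b, lb) in itertools.combinations(ents, 2):
        graph.add_node(a, label=la)
        graph.add_node(b, label=lb)
        # co-occurrence edge; weight counts how many documents link the pair
        prev = graph.get_edge_data(a, b, default={"weight": 0})["weight"]
        graph.add_edge(a, b, weight=prev + 1)

print(sorted(graph.edges(data=True)))
```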
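
And a minimal example of the kind of pull the repository toolkit automates, using the public GitHub REST API; the repository name is just a placeholder.

```python
import requests

def repo_health(owner: str, repo: str) -> dict:
    """Pull a few code-health signals from the public GitHub REST API."""
    resp = requests.get(
        f"https://api.github.com/repos/{owner}/{repo}",
        headers={"Accept": "application/vnd.github+json"},
        timeout=10,
    )
    resp.raise_for_status()
    data = resp.json()
    return {
        "stars": data["stargazers_count"],
        "open_issues": data["open_issues_count"],
        "last_push": data["pushed_at"],
        "default_branch": data["default_branch"],
    }

print(repo_health("python", "cpython"))  # any public repository works unauthenticated
```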

⚙️ Automation & Governance

  1. Developed Static-Site & API Deployment Pipelines – continuous Markdown-to-web builds and JSON API endpoints for analytical outputs (see the build sketch after this list).
  2. Architected AI Consulting & Automation Framework – productized internal automation stack into modular micro-services with paywall logic and dashboards.
  3. Maintained Control-Tower Documentation Layer – ADR and runbook system that records architecture decisions and operational lifecycles.
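
A toy version of the Markdown-to-web build step follows; the `content/` and `site/` paths are hypothetical, and the `markdown` package stands in for whatever renderer the pipeline actually uses.

```python
import json
from pathlib import Path
import markdown  # pip install markdown

SRC, OUT = Path("content"), Path("site")  # hypothetical layout
OUT.mkdir(exist_ok=True)

index = []
for md_file in sorted(SRC.glob("*.md")):
    html = markdown.markdown(md_file.read_text(encoding="utf-8"))
    (OUT / f"{md_file.stem}.html").write_text(html, encoding="utf-8")
    index.append({"slug": md_file.stem, "source": md_file.name})

# Static JSON payload that a site or API endpoint can serve as-is
(OUT / "index.json").write_text(json.dumps(index, indent=2), encoding="utf-8")
```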

🤝 Collaboration & Learning Tools

  1. Created Cross-Team Collaboration Layer – CRM + CMS systems turning datasets and research outputs into accessible dashboards for non-technical partners.
  2. Auto-Generated Pedagogical Code Materials – notebooks and examples derived from live repositories, ensuring reproducible teaching and documentation.
Deployment: LDD-NBS Course Exercises
Notebook-based exercises for Data Lab at UBA, built for reproducible learning.
  3. Deployed Question-Answering System – LLM-based app for internal knowledge retrieval and teaching assistance (see the retrieval sketch after this section).
Deployment: Evaluar App
AI-assisted exercise distribution and feedback for departmental courses.
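
A minimal sketch of the retrieval step behind the QA app, using ChromaDB's default embedding function; the sample notes are invented, and the LLM answer-generation call is elided because it depends on the provider.

```python
import chromadb  # pip install chromadb

client = chromadb.PersistentClient(path="./kb")  # on-disk store
notes = client.get_or_create_collection("course_notes")

notes.add(  # documents are embedded with ChromaDB's default embedding function
    ids=["n1", "n2"],
    documents=[
        "Office hours run Tuesdays at 14:00 in the lab.",
        "Assignments are distributed through the course repository.",
    ],
    metadatas=[{"topic": "logistics"}, {"topic": "assignments"}],
)

hits = notes.query(query_texts=["When are office hours?"], n_results=1)
print(hits["documents"][0][0])  # retrieved snippet, passed to the LLM as context
```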

Core competencies: Python, SQL, ChromaDB, FAISS, Docker, Airflow-style orchestration, BigQuery, GitHub Actions, OCR pipelines, and data governance design.

Focus areas: automation reliability, semantic data architecture, and reproducible analytical ecosystems.