About Me
Data without strategy is just storage cost. My 20+ year journey in IT has taught me that true data engineering begins with business understanding, not just code, and that data infrastructures must be invisible so that intelligence can take center stage.
Today, my work combines the technical rigor of Data Engineering with the consultative vision of Strategy and Governance. My mission is to bridge the gap between complex infrastructures and high-impact decisions, ensuring that data generates real value.
What I deliver as a Consultant and Specialist:
- Scalable Data Architectures: End-to-end Cloud projects (AWS, Azure, GCP), focused on Data Lakes, Data Warehouses, and Databricks.
- Technical Leadership and Governance: Structuring processes for quality, security, and data democratization, raising analytical maturity.
- Innovation with AI & Analytics: Implementing Machine Learning and BI solutions that transform massive data volumes into predictive intelligence.
My academic foundation in Data Science, Mathematics, and Computer Science, with a specialization in Artificial Intelligence, combined with an MBA in Business Intelligence & Analytics, allows me to move seamlessly between the depth of code and executive language.
Maturity and Creativity
I believe that technology, much like music—my great passion and hobby—requires harmony among different elements to function. I bring to projects the discipline of engineering and the creative sensitivity to solve complex problems in an innovative and collaborative way.
Thank you for the opportunity to present my professional profile. I'm open to conversations and collaborations, whether about technology or music; I'm always eager to connect!
Education
- MBA in Business Intelligence, Management & Analytics - São Judas Tadeu University
- Postgraduate in Artificial Intelligence - São Judas Tadeu University
- Degree in Technology, Data Science - Uninove University
- Bachelor's Degree in Mathematics and Computer Science - Mackenzie Presbyterian University
Professional Experience
Strategic role as Technical Reference and Consultant in one of the largest data environments in the financial sector, leading end-to-end solutions that transform complex information flows into business assets.
As guardian of the architecture, my responsibility covers the entire data lifecycle, from raw ingestion to strategic availability, ensuring scalability and governance.
Key Impact Areas:
- End-to-End Architecture & Cloud: Technical leadership in building and optimizing pipelines (ETL/ELT) using Azure Databricks, Delta Live Tables, and Azure Data Lake Storage, ensuring high performance in processing large volumes (Big Data).
- Data Mesh & Governance: Implementation of decentralized architecture (Data Mesh), taking ownership of domains and ensuring each data product meets strict standards of quality, security, and documentation.
- Efficiency and Performance: Critical optimization of Spark/PySpark jobs and complex queries, reducing latency and computational costs in mission-critical environments.
- Leadership & Mentoring: Acted as technical mentor for data analysts and scientists, disseminating software engineering standards and strengthening the data-driven culture.
- Strategic Partnership (Stakeholders): Translated business requirements into high-impact data solutions, acting as the link between executive needs and technical feasibility.
Acting as a Specialist Data Consultant, delivering end-to-end solutions that cover everything from architecture design to technical implementation. Expertise in modernizing legacy structures into high-performance cloud environments, transforming data ecosystems into strategic assets across diverse business segments.
My approach combines consultative diagnosis with technical execution to ensure that companies of all sizes have robust infrastructures. The central focus is process automation, strict governance (LGPD/GDPR), and maximum financial efficiency, delivering scalable environments with optimized cloud costs.
Key Delivery Fronts:
- End-to-End Data Engineering: Building robust pipelines in Python and SQL to integrate heterogeneous sources (APIs, relational/NoSQL databases, JSON, Parquet), ensuring continuous data flow from origin to consumption.
- Cloud Performance and Cost Optimization: Diagnosis and restructuring of cloud environments to achieve technical efficiency and reduce operational expenses, through fine-tuning resources and refactoring complex queries.
- Big Data Modeling and Architecture: Designing optimized data structures for scale, implementing engineering best practices such as version control (Git/GitHub), automated testing, and rigorous technical documentation.
- Governance and Compliance (LGPD/GDPR): Implementing security layers and privacy policies, ensuring technological innovation remains fully compliant with current regulations.
- Data Migration and Modernization: Technical leadership in cloud migration projects, ensuring data integrity and business continuity during technological transitions.
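As an illustration of the Python-and-SQL pipelines described above, here is a minimal extract-transform-load sketch using only the standard library, with `sqlite3` standing in for the target warehouse. All names and data here are hypothetical, not taken from any actual client project.

```python
import json
import sqlite3

# Hypothetical raw payload standing in for a heterogeneous source (API / JSON file).
RAW_JSON = """
[
  {"id": 1, "customer": " Ana ",  "amount": "120.50"},
  {"id": 2, "customer": "Bruno",  "amount": "75.00"},
  {"id": 3, "customer": "Carla",  "amount": null}
]
"""

def extract(payload: str) -> list:
    """Extract: parse the raw JSON into Python records."""
    return json.loads(payload)

def transform(records: list) -> list:
    """Transform: trim strings, cast amounts, drop incomplete rows."""
    rows = []
    for r in records:
        if r["amount"] is None:  # basic data-quality rule: skip incomplete rows
            continue
        rows.append((r["id"], r["customer"].strip(), float(r["amount"])))
    return rows

def load(rows: list, conn: sqlite3.Connection) -> None:
    """Load: idempotently upsert the clean rows into the target table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS sales "
        "(id INTEGER PRIMARY KEY, customer TEXT, amount REAL)"
    )
    conn.executemany("INSERT OR REPLACE INTO sales VALUES (?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")  # in-memory stand-in for the warehouse
load(transform(extract(RAW_JSON)), conn)
total = conn.execute("SELECT COUNT(*), SUM(amount) FROM sales").fetchone()
print(total)  # the row with a null amount was filtered out upstream
```

In a production setting, each stage would also carry monitoring and validation hooks, which is where the automated-testing and documentation practices mentioned above come in.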
During this period, I took a strategic career break focused on technical and personal development.
I dedicated myself intensively to studying new technologies, especially in the fields of Machine Learning and Artificial Intelligence, through courses and complementary training.
In addition to technical skills, I experienced multicultural environments in different countries, which broadened my global perspective, strengthened my adaptability, and brought new inspiration to my professional approach.
It was also a moment of reconnection with personal values, strengthening family bonds, and developing behavioral skills such as resilience, focus, and emotional balance.
I return to the market with renewed energy, a clear purpose, and a solid knowledge base, ready to contribute with innovative solutions and act with authenticity, creativity, and results-driven focus.
Main responsibilities:
- Data Pipelines (ETL/ELT): Design, develop, and automate pipelines for data collection, transformation, and loading from multiple sources (SQL, Oracle, Teradata, APIs, files, and streams) using HiveQL and Python.
- Scalable Architectures: Build and maintain Data Lakes, Data Warehouses, and distributed systems with a focus on performance, security, and governance.
- Data Quality and Availability: Ensure integrity and reliability through continuous monitoring and automated validations.
- Cloud and Big Data Integration: Integrate corporate data into cloud environments and Big Data tools (Apache Spark, Hadoop, Hive, Databricks).
- Collaboration with Data Science and BI: Provide clean, accessible, and analysis-ready data for predictive modeling.
- Documentation and Compliance: Document flows, models, and metadata according to governance and compliance standards (LGPD, GDPR).
- Performance and Scalability: Enhance system performance and scalability by applying engineering best practices and process automation.
Contributions and Results Achieved:
- Analytical Bases and KPIs: Creation and management of analytical bases and KPIs to support AI strategies in Telefônica's Collections area, increasing operational efficiency and accuracy of actions.
- SAP-Azure Databricks Integration: Ingestion and transformation of SAP ECC data into Heineken's Azure Databricks environment, ensuring reliability and standardization.
- Predictive Models: Development of Python models for customer churn prediction, achieving 85% accuracy and contributing to retention improvement and loss reduction.
Skills, Tools and Technologies:
- SQL · Python · Hive · Apache Spark · Hadoop · Teradata · Oracle · Databricks · AWS · Azure · GCP · Linux · Data Lake · ETL/ELT · Data Governance · Data Modeling · Data Storytelling
Main responsibilities:
- Diagnosis and Requirements: Identification of information needs, functional and technical requirements, with preparation of documentation and detailed specifications.
- Architecture and Integration: Design of data architectures, definition of pipelines (ETL/ELT), and integration of multiple sources (ERP, CRM, spreadsheets, APIs).
- Analytical Bases and Data Warehouse: Modeling, structuring, and development of analytical bases, data marts, and Data Warehouse solutions for dashboards and executive reports.
- KPIs, Indicators, and ROI: Mapping and definition of strategic KPIs/OKRs, performance monitoring, and analysis of operational metrics, with a focus on continuous optimization.
- Data Governance and Quality: Consulting in governance, regulatory compliance (LGPD/GDPR), and assurance of data quality, integrity, and security.
- Leadership and BI Projects: Technical coordination of BI projects, team mentoring, and promotion of best practices in modeling, automation, and operational efficiency.
- Strategic Support and Optimization: Identification of trends and opportunities for process improvement and support for data-driven decision-making.
Contributions and Results Achieved:
- Strategic Analysis and Indicators: Conducting advanced analyses with SAS Analytics and redefining KPIs, increasing the accuracy of reported information.
- Data Integration and Governance: Leadership in reinsurance data integration projects (Generali), ensuring quality and compliance.
- ETL and Automation: Maintenance, optimization, and automation of ETL processes (SSIS), reducing consolidation time and increasing reliability.
- Modeling and Databases: Optimization of Oracle databases and development of a CRM Datamart (GVT-Telefónica merger), ensuring consistency and strategic alignment.
Skills, Tools and Technologies:
- SQL · SSIS · Teradata · Oracle · PowerCenter · Erwin Data Modeler · SAS Enterprise Guide · Linux · Shell Script · Dimensional Modeling · Data Governance
Main responsibilities:
- IT Project Management (Agile and Waterfall): Coordinated the delivery of IT projects (Agile and Waterfall) from software factories, managing the full project lifecycle from requirements gathering to final delivery and ensuring deliverables met client expectations, deadlines, and quality requirements.
- Planning and Management of Deadlines and Budget: Led schedule definition, cost estimation, and resource allocation to keep projects within budget and on time, supervising all project phases and adjusting plans as needed to mitigate risks and resolve impediments.
- Quality and Risk Management: Implemented quality control strategies to ensure delivered products met technical and business requirements, conducting continuous risk assessments and mitigation plans to minimize the impact of issues during the project lifecycle.
- Stakeholder and Communication Management: Managed stakeholders effectively, keeping everyone aligned with objectives and scope changes through clear and continuous communication.
- Development Team Management: Supervised a multidisciplinary team, facilitating communication among members and ensuring alignment with project objectives.
- Quality Assurance and Successful Deliveries: Actively validated deliverables, ensuring that developed solutions complied with the client's technical and business requirements.
- Leadership in Test Monitoring and Improvement Implementation: Oversaw testing processes and implemented improvements, resulting in accurate and high-quality deliveries.
Skills, Tools and Technologies:
- Cobol · Project Planning · Oracle SQL Developer · Agile Methodologies · Problem Solving · Microsoft Excel · Oracle Database · Scrum · Kanban · Project Management · Continuous Improvement · Cross-functional Team Leadership · Teamwork · Microsoft Office · Information Technology
Main responsibilities:
- Technical Project Leadership: Provided technical guidance on projects for the client MAPFRE Seguros, ensuring successful deliveries and high-quality solutions aligned with business expectations and needs. Worked closely with multidisciplinary IT teams, promoting alignment of project requirements, ensuring technical excellence, and delivering scalable and sustainable solutions.
- Support Management: Coordinated the support team, focusing on resolving failures in nightly processes, ensuring operational continuity and mitigating impacts in the production environment.
Skills, Tools and Technologies:
- Cobol · Oracle SQL Developer · Technical Leadership · DBMS · Microsoft Excel · Oracle Database · Testing · Shell Script · Linux · Teamwork · Information Technology · SQL
Main responsibilities:
- Continuous Improvement: Identified bottlenecks and implemented improvements in development and operations, optimizing performance and process efficiency.
- Technical Documentation: Created detailed technical documentation of processes and workflows, ensuring clarity and support for other teams.
- System Support: Worked on support processes, focusing on resolving failures in nightly routines, ensuring continuity and efficiency of operations, minimizing impacts, and implementing agile corrections.
Contributions and Results Achieved:
- Database Migration and Upgrade: Participated in the migration and upgrade project of the Oracle 7.0 database to the latest version, ensuring data integrity, consistency, and compliance.
Skills, Tools and Technologies:
- Cobol · Databases · Programming · Oracle SQL Developer · PL/SQL · ETL (Extract, Transform, Load) · DBMS · Storage · Oracle Database · Shell Script · Teamwork · Information Technology · Linux · Forms · Reports · SQL
Main responsibilities:
- System and Database Migration: Actively participated in the migration project from BPCS ERP (Oracle PL/SQL / Dataflex) to Microsiga/TOTVS Protheus ERP, including business rules mapping, requirements gathering, data mapping, and legacy data analysis. Developed transformation strategies to ensure data integrity and efficiency in the transition of the product module.
- Technical Documentation: Prepared detailed technical documentation on ETL processes, data mapping, and migration workflows, providing support for future teams and stakeholders.
- Multidisciplinary Collaboration: Worked closely with IT teams such as system analysts, DBAs, and developers to align project requirements and ensure migration success.
Contributions and Results Achieved:
- Migration completed with zero failures in the product module load process, ensuring data consistency and compliance in the new system.
Skills, Tools and Technologies:
- Databases · ETL (Extract, Transform, Load) · DBMS · Microsoft Excel · Teamwork · Information Technology · Process Flow Documentation · SQL · PL/SQL · Linux · Dataflex
Main responsibilities:
- Process and ERP Management: Ensure alignment between business processes and proprietary ERP system, guaranteeing strategic and operational consistency.
- Requirements and Solutions: Gather and analyze requirements, identifying needs and proposing effective solutions.
- Data Analysis: Conduct reports and cross-checks of information to support decision-making.
- Technical and Functional Specifications: Prepare clear documentation aligned with project objectives.
- Coordination and Leadership: Manage technical teams and lead projects, ensuring efficient and high-quality deliveries.
Skills, Tools and Technologies:
- Unix · Databases · ETL (Extract, Transform, Load) · Unix Administration · Dataflex · Team Leadership · Teamwork · Information Technology · IT Infrastructure · Process Flow Documentation · SQL · PostgreSQL · MySQL · Visual Dataflex
Main responsibilities:
- Functional and Non-Functional Requirements: Analyze and define requirements aligned with business needs.
- Feature Development: Implement new functionalities in a proprietary ERP system, focusing on efficiency and usability.
- Documentation and Process Optimization: Produce technical documentation and optimize processes, ensuring compliance and standardization.
Skills, Tools and Technologies:
- Databases · Programming · Software Development · Continuous Improvement · Testing · Teamwork · Information Technology
Main responsibilities:
- System Maintenance and Support: Manage a proprietary integration system (Factory x Dealership), ensuring operational continuity and efficient problem resolution.
- Documentation and Testing: Create manuals and technical documentation, and perform tests to guarantee quality and standardization.
- User Training: Train dealership users across Brazil on the ERP, promoting efficiency and better tool usage.
Skills, Tools and Technologies:
- Customer Service · Databases · Client Support · DBMS · Microsoft Windows · Testing · Teamwork · Microsoft Office · Infrastructure · Technical Support · Information Technology
Technical Training Courses in Data and AI
| Course | Educational Institution |
|---|---|
| Machine Learning | São Judas Tadeu University |
| Databricks Fundamentals | Databricks Academy |
| Databricks for Data Engineering | Databricks Academy |
| Natural Language Processing | São Judas Tadeu University |
| Data Driven and Decision Process | São Judas Tadeu University |
| Big Data & Analytics | FIAP |
| Business Intelligence (BI) | FIAP |
| Neural Networks and Deep Learning | São Judas Tadeu University |
| AI-900 - Microsoft Azure AI Fundamentals | Bradesco Foundation |
| Linear Algebra and Data Science | Getúlio Vargas Foundation |
| Data Engineering - Fundamentals | Data Science Academy |
| FluêncIA - Artificial Intelligence | Bradesco Foundation |
| AI Solutions on GitHub | Bradesco Foundation |
| Data Science and Artificial Intelligence - Fundamentals | Data Science Academy |
| Data-Driven Decision Making | Mackenzie University |
| Microsoft Fabric - Data Architecture and Engineering | Udemy |
| AI - Optimization Algorithms in Python | Udemy |
| NLP with spaCy and Python | Udemy |
| Python for Data Science | Udemy |
| Text Summarization with NLP | Udemy |
| Big Data and Artificial Intelligence - Fundamentals | Udemy |
| Power BI + DAX | Udemy |
| Artificial Neural Networks in Python | Udemy |
| Dimensional Modeling | Cetax Training |
Additional IT Training Courses
| Course | Educational Institution |
|---|---|
| Blockchain | FIAP |
| Advanced Blockchain | FIAP |
| Code Versioning with Git and GitHub | DIO Academy |
| Python Programming Language | Bradesco Foundation |
| Python Development | FIAP |
| AWS Cloud Practitioner Essentials | AWS Training and Certification |
| Python | FIAP |
| Google Cloud Associate Engineer Certification (GCP) | Udemy |
| Cloud Computing - Essentials | Udemy |
| Information Security | Indra - In Company |
| Agile Management with Scrum | Udemy |
| AZ-900 - Microsoft Azure Fundamentals | Ka Solution |
| Big Data with Hadoop | Trainning Education Services |
| Scrum - Agile Project Management and Development | Impacta Tecnologia |
| Project Management | Tekno Software and Services |
| Oracle Developer Training | IC Training - Inter-Commerce |
| LCS - Linux Center Security (Red Hat) | Utah Computer Services |
| LCP - Linux Center Professional (Red Hat) | Utah Computer Services |
| LCA - Linux Center Administration (Red Hat) | Utah Computer Services |
| Supporting MS Windows NT 4.0 | Impacta Tecnologia |
| Visual Dataflex 7.0 | DataAccess - In Company |
| Novell Netware Network Administration | Impacta Tecnologia |
| Dataflex 3.1 | DataAccess - In Company |
| Visual Basic | Kyoei Facom |
| Structured Cobol | Kyoei Facom |
Cross-Training and Soft Skills Courses
| Course | Educational Institution |
|---|---|
| Effective Communication | Udemy |
| Ethics and Compliance | Indra - In Company |
| Design Thinking - Introductory Training | Udemy |
| Design Thinking from A to Z | Udemy |
| Ethics and Competition | Indra - In Company |
| Leanmaking White Belt | Indra - In Company |
| Sustainability | Indra - In Company |
| Team Formation and Management | IPEB Educational Institute |
Projects
Thank you for your interest in my work!
You can explore my projects in Data Engineering, Data Science, Artificial Intelligence and Machine Learning by visiting my online portfolio:
It is constantly evolving — both in navigation and in the addition of new projects — with the goal of sharing what I have been creating, learning, and improving throughout my professional journey.
In the meantime, feel free to get in touch — I'd be delighted to talk with you!
Contact
I am open to opportunities, projects, partnerships, and good conversations about data, technology, AI, and digital transformation.
You can reach me directly through the contacts below or, if you prefer, send me a message via the form.