Posted on Nov 13, 2024

Software Senior Engineer - Big Data - Hybrid

Mid-Senior ICs
Engineering
Boomi
Private
1001-5000
Software, Security & Developer Tools

Boomi is the platform for intelligent connectivity and automation. Connect everyone to everything, anywhere.

Job Description

About us

Boomi’s Cloud API Management, the pioneer in the API management space, was born out of key business needs in the market to enable faster app development and partner growth, provide better, secure API performance, and deliver data-powered insights. We help customers launch their API programs quickly to get new products and services to market faster and connect with business partners more easily to foster innovation and expand routes to market. We can also deliver successful API performance with integrated security, and ensure that analytics and a performance dashboard help users understand what works, what doesn’t, and how to improve governance for the API program.

How You’ll Make An Impact

As a Software Senior Engineer, you will be responsible for developing sophisticated Big Data systems and software based on the customer’s business goals, needs, and general business environment. You will work with product management, other engineering teams, customer success, and support to develop cutting-edge product features and enhancements across various areas of Boomi Cloud API Management services.

What You’ll Do

  • Be a member of an Agile team, collaboratively delivering features through software development lifecycle processes and test-driven development methodologies.
  • Participate in the design, development, unit testing, and deployment of Boomi’s Big Data services, including enhancements and resolution of reported issues.
  • Investigate and resolve complex customer issues.
  • Data pipeline development: Design, build, and maintain scalable data pipelines for processing large volumes of structured and unstructured data.
  • Data storage: Develop and optimize data storage solutions using Hadoop, NoSQL, or cloud-based systems to handle large datasets.
  • Data processing: Implement and maintain batch and real-time data processing systems using technologies such as Apache Spark, Hadoop, or Kafka.
  • Data integration: Integrate data from internal and external sources and ensure data consistency, quality, and security.
  • Performance optimization: Monitor and improve the performance of data processing jobs, ensuring they run efficiently and at scale.
  • Collaboration: Work closely with data scientists, analysts, and stakeholders to understand data requirements and provide technical solutions.
  • Testing and deployment: Implement automated testing and continuous integration/deployment (CI/CD) for Big Data systems to ensure robustness.
  • Data governance and security: Ensure compliance with data governance and security policies while working with sensitive data.
  • Documentation: Prepare clear documentation for data pipelines, processes, and data systems architecture.
  • Work independently with minimal guidance from technical leadership.

The Experience You Bring

  • Experience designing, developing, and maintaining software products that process Big Data streaming pipelines, with a solid understanding of MapReduce architecture and software development lifecycle processes.
  • Proficiency in programming languages such as Java and Python for data processing.
  • Experience with cloud platforms such as AWS, GCP, or Azure for deploying and scaling Big Data systems.
  • Experience with data modeling and schema design for Big Data applications, and with messaging frameworks that can handle Big Data loads, such as Kafka and Kinesis.
  • Experience with real-time data processing and stream-processing frameworks such as Spark and Flink.
  • Strong analytical skills and the ability to solve complex problems related to Big Data processing and management.
  • Strong knowledge of SQL and NoSQL databases, including Cassandra, MongoDB, HBase, or similar technologies.
  • Experience with agile software development processes, such as sprint models, and collaboration tools such as JIRA and Confluence.
  • Strong understanding of computer science fundamentals, including algorithms and data structures.

 

 

Bonus Points If You Have

  • 5+ years of experience in software development
  • Familiarity with DevOps and CI/CD tools such as Harness and Jenkins
  • Familiarity with data ingestion tools such as Fluent Bit and Stash, observability applications such as Kibana and OpenSearch, and monitoring applications such as Prometheus or New Relic
  • Familiarity with containerization and orchestration tools (e.g., Docker, Kubernetes)

 

Location

Conshohocken, PA - Hybrid

Aren’t sure if you’re a match? We know that impostor syndrome and the confidence gap can prevent us from meeting spectacular candidates — so don’t hesitate to apply; you could be the perfect fit!

Compensation and Benefits

Boomi is committed to fair and equitable compensation practices. An overview of our benefits can be found here.

#LI-ES1

 

About Boomi and What Makes Us Special

Are you ready to work at a fast-growing company where you can make a difference? Boomi aims to make the world a better place by connecting everyone to everything, anywhere. Our award-winning, intelligent integration and automation platform helps organizations power the future of business. At Boomi, you’ll work with world-class people and industry-leading technology. We hire trailblazers with an entrepreneurial spirit who can solve challenging problems, make a real impact, and want to be part of building something big. If this sounds like a good fit for you, check out boomi.com  or visit our Boomi Careers page to learn more.

Be Bold. Be You. Be Boomi. We take pride in our culture and core values and are committed to being a place where everyone can be their true, authentic self. Our team members are our most valuable resources, and we look for and encourage diversity in backgrounds, thoughts, life experiences, knowledge, and capabilities.  

All employment decisions are based on business needs, job requirements, and individual qualifications.

Boomi strives to create an inclusive and accessible environment for candidates and employees. If you need accommodation during the application or interview process, please submit a request to talent@boomi.com. This inbox is strictly for accommodations, please do not send resumes or general inquiries. 
