Motion Recruitment | Jobspring | Workbridge

Data Engineer with MuleSoft

Hoboken, New Jersey

Open to Remote

Direct Hire

$120k - $135k

A day in the life of a Data Engineer starts with building and delivering high-quality data architectures and pipelines that support clients, business analysts, and data scientists. You will also work with other technology teams to extract, transform, and load data from several data sources. Data Engineers improve ongoing reporting and processes and automate or simplify self-service for our clients. You will develop, code, and deploy scripts in Python, as Python is the language of data.

Responsibilities
  • Develop, construct, test, and maintain data architectures designed by the data architect
  • Analyze organic and raw data
  • Build data systems and pipelines
  • Build the infrastructure required for extraction, transformation, and loading of data from different data sources using SQL and AWS ‘big data’ technologies
  • Write scripts for data architects, data scientists, and data quality engineers
  • Acquire data from source systems
  • Identify ways to improve data reliability, efficiency, and quality
  • Develop data-set processes
  • Prepare data for prescriptive and predictive modeling
  • Automate data collection and analysis processes, data release, and reporting tools
  • Build algorithms and prototypes
  • Develop analytical tools and programs
Qualifications
  • Bachelor’s or master’s degree in Computer Science, Engineering, or a related field
  • MuleSoft Certified Integration Associate and/or MuleSoft Certified Developer – Level 1
  • Five or more years of experience working in AWS as a Data Engineer
  • Experience working as a Data Engineer in a professional services or consulting environment
  • Proficiency in programming languages such as Python, Java, or Scala, with expertise in data processing frameworks and libraries (e.g., Spark, Hadoop, SQL)
  • In-depth knowledge of database systems (relational and NoSQL), data modeling, and data warehousing concepts
  • Experience with cloud-based data platforms and services (e.g., AWS, Azure) including familiarity with relevant tools (e.g., S3, Redshift, BigQuery, etc.)
  • Proficiency in designing and implementing ETL processes and data integration workflows using tools like Apache Airflow, Informatica, or Talend
  • Familiarity with data governance practices, data quality frameworks, and data security principles
  • Work with minimal direction, turning a client's wants and needs into actionable stories and epics that can be worked on during a sprint
  • A firm understanding of the SDLC process
  • An understanding of object-oriented programming
Company Offered Benefits

Full-time employees are eligible to participate in our employee benefit programs:

  • Medical, dental, and vision insurance
  • Short-term disability, long-term disability, and life insurance
  • 401(k) with company match
  • Paid time off (PTO): 120 hours accrued over one year
  • Paid time off for major holidays (14 days per year)
  • These and any other employee benefit offerings are subject to management's discretion and may change at any time.

Posted by: Paddy Beauchamp
