
CDW is an equal opportunity/affirmative action employer committed to a diverse and inclusive workplace.
If you need assistance in applying for a position, please complete our accommodation request form.

Sr Data Software Engineer - SMIT at CDW Careers

Job ID: 
23001021
Focus Area: 
Information Technology
Location: 
Remote
This job posting is no longer active

Service Management, Integration, and Transport (SMIT) is one of the largest IT services programs for the Navy and U.S. Marine Corps (USMC), supporting approximately 400,000 seats around the world. The SMIT contractor provides enterprise services to the Marine Corps Enterprise Network (MCEN), Navy-Marine Corps Internet (NMCI), and OCONUS Navy Enterprise Network (ONE-Net), including transport, datacenter, network operations, service desk, cybersecurity, and managed services. CDW-G supports the SMIT contractor in several of these service areas.

The Sr Data Software Engineer - SMIT will play a pivotal role in building and operationalizing the data necessary for enterprise data and analytics initiatives, following best practices. The bulk of the data engineer's work is building, managing, and optimizing data pipelines, then moving those pipelines into production for key data and analytics consumers such as business/data analysts, data scientists, or any persona that needs curated data for data and analytics use cases across the enterprise.

 

Key Areas of Responsibility

  • Develops and maintains scalable data pipelines to support continuing increases in data volume and complexity.
  • Interfaces with other technology teams to extract, transform, and load data from a wide variety of data sources using big data technologies and SQL.
  • Collaborates with business teams to improve data models that feed business intelligence tools, increasing data accessibility and fostering data-driven decision making across the organization.
  • Collaborates with other technology teams to help engineer data sets that data science teams use to implement advanced analytics algorithms that exploit our rich datasets for statistical analysis, prediction, clustering, and machine learning.
  • Uses innovative and modern tools, techniques, and architectures to partially or completely automate the most common, repeatable, and tedious data preparation and integration tasks, minimizing manual, error-prone processes and improving productivity.
  • Trains counterparts such as data scientists, data analysts, and other data consumers in data pipelines and preparation techniques, making it easier for them to integrate and consume the data they need for their own use cases.
  • Helps ensure compliance and governance during use of data.
  • Stays curious and knowledgeable about new data management techniques and how to apply them to solve business problems.

Required Qualifications

  • Strong experience with various data management architectures such as Data Warehouse, Data Lake, and Data Hub, and the supporting processes such as Data Integration, Governance, and Metadata Management.
  • Extensive experience with popular data processing languages such as SQL, PL/SQL, and Python for relational databases, and with NoSQL/Hadoop-oriented databases such as MongoDB and Cassandra for nonrelational data.
  • Demonstrated ability to build rapport and maintain productive working relationships cross-departmentally and cross-functionally.
  • Demonstrated ability to coach and mentor others.
  • Excellent written and verbal communication skills with the ability to effectively interact with and present to all stakeholders including senior leadership.
  • Strong organizational, planning, and creative problem-solving skills with critical attention to detail.
  • Demonstrated success in facilitation and solution implementation.
  • History of balancing competing priorities with the ability to adapt to the changing needs of the business while meeting deadlines.

Preferred Qualifications

  • Extensive experience with Azure Data Factory.
  • BS or MS degree in Computer Science or a related technical field.
  • Extensive experience working with a cloud platform (at least one of Azure, AWS, or GCP).
  • Experience with ETL tools (SSIS, Informatica, Ab Initio, Talend), Python, Databricks, Microsoft SQL Server Platform (version 2012 or later), and/or working in an Agile environment.
Date Posted: Mar 21, 2023
Job Category: Engineering
Travel Percentage: 0
 