Data Architect - Remote / Telecommute

Cynet Systems Philadelphia, PA (Remote) Full-Time
Job Description:

Pay Range: $65/hr - $70/hr

Responsibilities:
  • Data Architecture Design: Design and develop scalable, high-performance data architecture solutions using PySpark, ADF, and Power BI to support business intelligence, analytics, and reporting needs
  • Data Pipeline Development: Build and manage robust data pipelines using PySpark and Azure Data Factory, ensuring efficient data extraction, transformation, and loading (ETL) processes across various data sources
  • Data Modeling: Develop and maintain data models that optimize query performance and support the needs of analytics and reporting teams
  • Integration and Automation: Design and implement integration strategies to automate data flows between systems and ensure data consistency and accuracy
  • Collaboration: Work closely with data engineers, data analysts, business intelligence teams, and other stakeholders to understand data requirements and deliver effective solutions
  • Data Governance and Security: Ensure data solutions adhere to best practices in data governance, security, and compliance, including data privacy regulations and policies
  • Performance Optimization: Continuously monitor and optimize data processes and architectures for performance, scalability, and cost-efficiency
  • Reporting and Visualization: Utilize Power BI to design and develop interactive dashboards and reports that provide actionable insights for business stakeholders
  • Documentation: Create comprehensive documentation for data architecture, data flows, ETL processes, and reporting solutions
  • Troubleshooting and Support: Provide technical support and troubleshooting for data-related issues, ensuring timely resolution and minimal impact on business operations
Qualifications:
  • Bachelor's degree in Computer Science, Information Technology, Data Science, or a related field
Experience:
  • 15+ years of experience in data architecture and engineering, with a focus on PySpark, ADF, and Power BI
  • Proven experience in designing and implementing data pipelines, ETL processes, and data integration solutions
  • Strong experience in data modeling and data warehouse design
Technical Skills:
  • Proficiency in PySpark for big data processing and transformation
  • Extensive experience with Azure Data Factory (ADF) for data orchestration and ETL workflows
  • Strong expertise in Power BI for data visualization, dashboard creation, and reporting
  • Knowledge of Azure services (e.g., Azure Data Lake, Azure Synapse) and other relevant cloud-based data technologies
  • Strong SQL skills and experience with relational databases
Soft Skills:
  • Excellent problem-solving and analytical skills
  • Strong communication and collaboration skills, with the ability to work effectively with cross-functional teams
  • Ability to manage multiple priorities in a fast-paced environment
Preferred Qualifications / Certifications:
  • Microsoft certifications related to Azure, Power BI, or data engineering are a plus
  • Experience in a similar role within a large enterprise environment is preferred

Recommended Skills

  • Analytical
  • Automation
  • Big Data
  • Business Intelligence
  • Communication
  • Data Analysis


Job ID: 4dcce114a106bc62f4c3e7922
