
Senior Data Scientist - Python, SQL, ML & GCP - Fully Remote

  • Posted April 24, 2024
  • £700 - £800 per day
  • Fully remote based
  • Contract

Senior Data Scientist - Python, SQL, ML & GCP - Fully Remote

Daily rate £700-£800 via Umbrella

A global Ecommerce business is looking to recruit a Senior Data Scientist to be responsible for establishing best practices and for developing and deploying the core ML/AI algorithms required to tackle its data science challenges.

Experience/skills:

  • Significant experience working as a Senior Data Scientist
  • Solid knowledge of SQL and Python’s ecosystem for data analysis (Jupyter, Pandas, scikit-learn, Matplotlib); a minimal sketch follows this list.
  • Proven GCP experience (developing GCP machine learning services)
  • Solid understanding of computer science fundamentals, including data structures, algorithms, data modelling and software architecture
  • Strong knowledge of Machine Learning algorithms (e.g. Logistic Regression, Random Forest, XGBoost) as well as state-of-the-art research areas (e.g. NLP, transfer learning) and modern Deep Learning architectures (e.g. BERT, LSTM)
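
As a rough illustration of the Python data-analysis stack named in the list above, here is a minimal sketch using Pandas and scikit-learn. It is hypothetical: the file name, the "churned" target column and the model choices are placeholders rather than details of the role, and it assumes a tidy table of numeric features.

# Minimal, hypothetical sketch of the Pandas / scikit-learn workflow named above.
# Assumes a CSV of numeric features with a binary "churned" target column.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

df = pd.read_csv("orders.csv")  # placeholder file name

X = df.drop(columns=["churned"])
y = df["churned"]
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

# Compare a simple baseline against a tree ensemble on held-out data.
for model in (LogisticRegression(max_iter=1000), RandomForestClassifier(n_estimators=200, random_state=42)):
    model.fit(X_train, y_train)
    auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
    print(type(model).__name__, round(auc, 3))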

Any Retail or Ecommerce experience is highly desirable.

Data Scientist/ ML/ AI/ Python/ SQL/ Algorithms/ GCP #NLP

Please get in touch for more details- [email protected]



Data Engineer

  • Posted April 22, 2024
  • £45000 - £54000 per annum + Bonus
  • Permanent

Job Title: PySpark Data Engineer – Azure

Salary: up to £54,000

Mostly remote

About Us: Join our innovative team where PySpark expertise meets Azure ingenuity. We’re passionate about leveraging data to drive business success and are looking for a skilled Data Engineer to join us in delivering high-impact solutions.

Key Responsibilities:

  • Develop secure, efficient data pipelines using PySpark for ingestion, transformation, and consumption within Azure (a minimal sketch follows this list).
  • Ensure data quality and adherence to best practices throughout the pipeline life cycle.
  • Design and optimise physical data models to meet business needs and storage requirements.
  • Collaborate with cross-functional teams to deliver BI solutions and reporting structures using Power BI.
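
As a rough illustration of the PySpark ingestion and transformation work described in the list above, here is a minimal, hypothetical sketch; the storage account, container, paths and column names are placeholders, not details from this role.

# Minimal, hypothetical PySpark sketch: ingest raw files from Azure Data Lake
# Storage, apply a simple transformation, and write a curated output.
# Paths, container names and columns are placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-pipeline").getOrCreate()

# Ingest: read raw CSV landed in ADLS (the abfss path is a placeholder).
raw = (
    spark.read
    .option("header", True)
    .csv("abfss://raw@examplelake.dfs.core.windows.net/orders/")
)

# Transform: type the columns, drop obviously bad rows, add a load date.
curated = (
    raw.withColumn("order_value", F.col("order_value").cast("double"))
       .filter(F.col("order_id").isNotNull())
       .withColumn("load_date", F.current_date())
)

# Consume: persist as Parquet for downstream reporting (e.g. Power BI).
curated.write.mode("overwrite").parquet("abfss://curated@examplelake.dfs.core.windows.net/orders/")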

Experience & Qualifications:

  • 2-5 years of experience in designing and implementing PySpark-based data solutions.
  • Proficiency in SQL and Azure technologies (Data Factory, Synapse).
  • Strong understanding of data life cycle management and CI/CD principles.
  • Experience working with large, event-based data sets in enterprise environments.
  • Excellent communication skills with a passion for leveraging data to drive business value.

Skills & Attributes:

  • Creative problem solver with a proactive and optimistic mindset.
  • Strong time management and organisational skills, capable of handling multiple priorities.
  • Enthusiastic team player committed to delivering shared outcomes.
  • Ability to mentor and support less experienced engineers.
  • Power BI knowledge is a bonus.

What We Offer:

  • Competitive salary and benefits package.
  • Opportunities for professional growth and advancement.
  • A collaborative and supportive work environment where your ideas are valued.

Join Us: If you’re ready to harness the power of PySpark and Azure to shape the future of data analytics, apply now! Let’s make it happen together.



Data Modeller - Source to Target - Fully Remote

  • Posted April 19, 2024
  • £550 - £600 per day
  • Fully remote based role
  • Contract

Data Modeller - Source to Target - Fully Remote

£550-£600 per day via Umbrella

A leading global business is looking for a Data Designer/Data Modeller to carry out data design.

Data Modeller - Role:

  • Create source-to-target mappings (a minimal sketch follows this list)
  • Physical Data Modelling (physical rather than conceptual, as the target architecture is already well defined)
  • Documentation (e.g. documenting the agreed way the data will be formatted)
  • Work closely with the business and systems SMEs to understand what they want to achieve from the data
  • Suggest Data Quality rules
  • Contribute to the optimisation of data storage, retrieval, and processing.
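
To make the source-to-target mapping deliverable a little more concrete, here is a small, hypothetical sketch of how such a mapping might be captured as structured data and turned into review documentation; every table, column and rule name is invented for illustration.

# Hypothetical sketch: capture a source-to-target mapping as data, then emit
# a plain-text mapping document for review with the business and systems SMEs.
# All table, column and rule names are invented.
from dataclasses import dataclass


@dataclass
class ColumnMapping:
    source: str  # source table.column
    target: str  # target table.column
    rule: str    # agreed transformation / formatting rule


mappings = [
    ColumnMapping("crm.customer.cust_nm", "dw.dim_customer.customer_name", "trim, title case"),
    ColumnMapping("crm.customer.dob", "dw.dim_customer.date_of_birth", "cast to DATE (YYYY-MM-DD)"),
    ColumnMapping("orders.hdr.order_amt", "dw.fact_order.order_value_gbp", "cast to DECIMAL(12,2)"),
]

for m in mappings:
    print(f"{m.source:28} -> {m.target:32} [{m.rule}]")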

Experience/Skills:

  • Solution-focused Data Designer with proven data modelling experience
  • Excellent communication and stakeholder management skills
  • Ideally you will have worked in a consumer-facing sector
  • Strong SQL skills and understanding of database management systems.

Any Databricks experience is highly desirable.

#datadesigner #dataarchitect #datamodelling #physicalmodels #datamapping #kimball #systems #applications #SQL #databricks

Please get in touch for more details- [email protected]



Data Engineer

  • Posted
  • £40000 - £55000 per annum + Benefits
  • Manchester
  • Permanent

Job Title: Data Engineer
Location: Manchester
Package: £40,000 – £55,000 + Benefits
Type: Permanent

Sanderson Recruitment is recruiting for a Data Engineer on behalf of our leading Insurance client based in Manchester.

Company Overview:
Are you interested in joining a leading insurance company headquartered in the UK? Established over a decade ago, my client specialises in providing a range of insurance services tailored to meet the diverse needs of their customers.

With a primary focus on the motor insurance market, they offer comprehensive car insurance directly through their brand, as well as underwriting services to other insurers. In addition to motor insurance, they also provide various supporting services related to insurance, including financing, distribution, and legal assistance. My client’s commitment to utilising technology and data-driven strategies ensures they deliver high-quality products and services to their customers while mitigating risks effectively.

Role & Responsibilities:
As a Data Engineer, you will be actively participating in technical tasks, focusing on constructing data solutions for projects and ongoing data products. Your responsibilities will include…

  • Develop secure, efficient data pipelines of varying complexity, integrating data from diverse sources, both on-premise and off-premise, internal and external.
  • Ensure data integrity and quality by cleansing, mapping, transforming, and optimising data for storage, aligning with business and technical requirements.
  • Incorporate data observability and quality measures into pipelines to facilitate self-testing and early detection of processing issues or discrepancies (a minimal sketch follows this list).
  • Construct solutions to transform and store data across different storage areas, including data lakes, databases, and reporting structures, spanning data warehouse, business intelligence systems, and analytics applications.
  • Design physical data models tailored to business needs and storage optimisation, emphasising reusability and scalability.
  • Conduct thorough unit testing of own code and peer testing to maintain high quality and integrity.
  • Document pipelines and code comprehensively to ensure transparency and facilitate understanding.
  • Adhere to coding standards, architectural principles, and release management processes to ensure code safety, quality, and compliance.
  • Provide guidance and support to Associate Data Engineers through coaching and mentoring.
  • Develop BI solutions of varying complexity, including data marts, semantic layers, and reporting & visualisation solutions using recognised BI tools such as Power BI.
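
As an illustration of building observability and data-quality checks into a pipeline, here is a minimal, hypothetical PySpark sketch; the path, column names and thresholds are placeholders rather than anything specified by the client.

# Hypothetical sketch: simple self-testing checks inside a PySpark pipeline.
# The path, key column and thresholds are placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("policy-quality-checks").getOrCreate()

df = spark.read.parquet("/lake/silver/policies/")  # placeholder path

# Observability metrics: row count, null rate on the key, duplicate keys.
total = df.count()
null_keys = df.filter(F.col("policy_id").isNull()).count()
duplicate_keys = total - df.dropDuplicates(["policy_id"]).count()

metrics = {"rows": total, "null_policy_id": null_keys, "duplicate_policy_id": duplicate_keys}
print(metrics)  # in practice these would be emitted to logging/monitoring

# Fail fast so downstream consumers never see bad data.
if total == 0 or null_keys > 0 or duplicate_keys > 0.01 * total:
    raise ValueError(f"Data quality checks failed: {metrics}")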

Essential Requirements:
To thrive in this role, you will need:

  • Demonstrated proficiency in PySpark and SQL development, with a strong interest in advancing your career in data engineering.
  • Enthusiasm for leveraging Azure best practices to facilitate seamless data delivery from source to consumption on a daily basis.
  • The ability to translate customer requirements into actionable designs and deliver them on time.
  • 2-5 years of experience in designing and implementing end-to-end data solutions.
  • Proficiency in SQL Server and Azure technologies such as Data Factory and Synapse, along with expertise in associated ETL technologies.
  • Experience working with large, event-based datasets within an enterprise setting.
  • Familiarity with testing techniques and tools to ensure data quality and integrity.
  • Strong interpersonal and communication skills, with an ability to build strong relationships.
  • Active engagement in the data community, with a keen interest in leveraging data to drive business value.
  • A comprehensive understanding of the complete data lifecycle.
  • Experience with Continuous Integration / Continuous Delivery (CI/CD) practices.
  • A proven track record of thriving in agile environments and working in self-managing teams.

This role offers an exciting opportunity to drive data innovation within a forward-thinking organisation. If you’re ready to make a direct and meaningful contribution to my client’s dynamic work environment, apply now.



Data Engineer - SQL / Python / Azure

  • Posted April 18, 2024
  • £40000 - £50000 per annum + Excellent Benefits
  • Permanent

Data Engineer - SQL / Python / Azure

£40k-£50k + Excellent Benefits

Bristol - onsite a couple of times per month

Are you passionate about leveraging cutting-edge technologies to drive innovation and solve complex data challenges? Do you thrive in a dynamic environment where your contributions make a tangible impact?

If so, I am working with a successful scale-up business looking to bring a Data Engineer into its growing team.

Role:

  • Design, develop, and maintain scalable data pipelines using Azure Synapse, Azure Data Lake Storage (ADLS), and Apache Spark to support various data-driven initiatives.
  • Collaborate with cross-functional teams to understand data requirements and implement efficient solutions to ingest, transform, and analyse large volumes of data.
  • Utilise Azure DevOps for continuous integration and continuous deployment (CI/CD) of data pipelines and infrastructure as code (IaC) using Terraform.
  • Leverage Python and SQL to write robust, efficient, and maintainable code for data processing, analysis, and reporting.
  • Design and implement data models and schemas following Medallion Architecture principles to ensure data integrity, consistency, and reliability (a minimal sketch follows this list).
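
To illustrate the Medallion (bronze/silver/gold) layering referred to in the list above, here is a minimal, hypothetical PySpark sketch; the lake paths and columns are placeholders, and a real implementation on Azure would more likely use Delta tables in Synapse or Databricks rather than plain Parquet paths.

# Hypothetical sketch of Medallion-style layering: bronze (raw) -> silver
# (cleansed) -> gold (business aggregate). Paths and columns are placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("medallion-demo").getOrCreate()

# Bronze: land the raw data as-is.
bronze = spark.read.option("header", True).csv("abfss://landing@examplelake.dfs.core.windows.net/sales/")
bronze.write.mode("append").parquet("/lake/bronze/sales/")

# Silver: type, deduplicate and filter the raw records.
silver = (
    spark.read.parquet("/lake/bronze/sales/")
    .withColumn("amount", F.col("amount").cast("double"))
    .dropDuplicates(["sale_id"])
    .filter(F.col("sale_id").isNotNull())
)
silver.write.mode("overwrite").parquet("/lake/silver/sales/")

# Gold: business-level aggregate ready for reporting.
gold = silver.groupBy("region").agg(F.sum("amount").alias("total_sales"))
gold.write.mode("overwrite").parquet("/lake/gold/sales_by_region/")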

Experience/Skills:

  • Experience working as a SQL Developer or Data Engineer
  • While experience with Azure Synapse, ADLS, Azure DevOps, Apache Spark, Python, SQL, Terraform, and Medallion Architecture is preferred, don’t worry if you don’t have direct experience with every technology listed. My client values individuals who are eager to learn and grow, and they provide opportunities for training and development.
  • Strong problem-solving skills and attention to detail.
  • Excellent communication and collaboration skills, with the ability to work effectively in a team environment.

#sqldeveloper #dataengineer #sql #python #azure #adls #apache #synapse #terraform

Please get in touch for more details- [email protected]



Data Architect

  • Posted April 17, 2024
  • £650 - £750 per day
  • Wiltshire - Hybrid working
  • Contract

Data Architect – Financial Services

£650-£750 pd via Umbrella / Initially for 6 months with long-term potential

Wiltshire - Hybrid working

A leading Financial Services business is looking to recruit a Data Architect to develop and implement its data strategy.

Role:

  • Act as a lead Architect on major data programs
  • Deliver and recommend design options and architectural blueprints/artefacts that enable the architecture to be pragmatically developed and operated
  • Develop reference architectures, principles and standards
  • Collaborate with Solution, Enterprise, and Business Architects to design and deliver forward-thinking data solutions
  • Ensure effective data structuring and alignment with technology strategy by closely collaborating with operational teams, while providing support to internal staff, partners, and clients, leveraging Salesforce and industry automation tools

Skills/Experience:

  • Significant experience working as a Data Architect within a regulated/Financial Services environment
  • Technical proficiency in data modelling, SQL, and integration tools
  • Strong data modelling experience (conceptual, logical, and physical)
  • Extensive experience of delivering solutions and developing data solutions leveraging cloud components (AWS or Azure)
  • Excellent stakeholder management and communication skills.
  • Experience with a variety of database technologies (Relational, OLAP, OLTP, NoSQL and Graph)
  • 10+ years of experience in a development environment, ideally in financial services

Please get in touch- [email protected]



Azure Data Engineer

  • Posted April 10, 2024
  • £450 - £500 per day + Outside IR35
  • Fully remote (one day a month in London)
  • Contract

Seeking an Azure Data Engineer for a 6-month contract with a leading business. Rate: £450-£500 per day, outside IR35. One day a month in London, fully remote otherwise.

Key Experience:

  • Azure Data Lakes Expertise
  • Proficiency in Azure Cloud Services
  • Terraform Mastery

Responsibilities:

  • Develop & Maintain Ingestion & ETL Processes
  • Optimise Data Platform Pipelines
  • Scale Cloud Data Platforms
  • Implement Enterprise Data Modelling Approach
  • Collaborate with Teams for Solution Development
  • Support Data Privacy & Compliance
  • Develop Strategic Data Platforms & Reporting

Requirements:

  • Proven Experience in Ingestion & ETL Processes
  • Expertise in Performance Optimisation
  • Strong Background in Cloud Data Platforms
  • Terraform Proficiency
  • Understanding of Data Privacy Regulations
  • Experience of working in the Public Sector is beneficial

Ready to dive into an innovative project? Apply now!



Senior Data Scientist - Python, SQL, ML & GCP - Fully Remote

  • Posted April 9, 2024
  • £700 - £800 per day
  • Fully Remote
  • Contract

Senior Data Scientist - Python, SQL, ML & GCP - Fully Remote

Daily rate £700-£800 via Umbrella

A global Ecommerce business is looking to recruit a Senior Data Scientist to be responsible for establishing best practices and for developing and deploying the core ML/AI algorithms required to tackle its data science challenges.

Experience/skills:

  • Significant experience working as a Senior Data Scientist
  • Solid knowledge of SQL and Python’s ecosystem for data analysis (Jupyter, Pandas, scikit-learn, Matplotlib).
  • Proven GCP experience (developing GCP machine learning services)
  • Solid understanding of computer science fundamentals, including data structures, algorithms, data modelling and software architecture
  • Strong knowledge of Machine Learning algorithms (e.g. Logistic Regression, Random Forest, XGBoost) as well as state-of-the-art research areas (e.g. NLP, transfer learning) and modern Deep Learning architectures (e.g. BERT, LSTM)

Any Retail or Ecommerce experience is highly desirable.

Data Scientist/ ML/ AI/ Python/ SQL/ Algorithms/ GCP

Please get in touch for more details- [email protected]



BI Analyst - Power BI & BigQuery

  • Posted
  • £500 - £600 per day
  • South West - mostly remote, a couple of times per month onsite
  • Contract

BI Analyst – Power BI & BigQuery
£500-£600 pd via Umbrella

South West – mostly remote, a couple of times per month onsite

A leading business is looking to bring in a BI Analyst at this busy time.

Role:

  • Design, develop, and optimise data models and queries in BigQuery to extract meaningful insights from large datasets (a minimal sketch follows this list).
  • Create visually compelling reports and dashboards in Power BI to communicate key metrics and trends.
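
As a rough illustration of the BigQuery side of the role, here is a minimal, hypothetical sketch using the google-cloud-bigquery Python client; the project, dataset, table and column names are placeholders, not details of the engagement.

# Hypothetical sketch: run an aggregate query in BigQuery and pull the result
# into a DataFrame (for example, to sanity-check figures behind a Power BI page).
# Project, dataset, table and column names are placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")

sql = """
    SELECT order_date, SUM(order_value) AS revenue
    FROM `example-project.sales.orders`
    WHERE order_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 90 DAY)
    GROUP BY order_date
    ORDER BY order_date
"""

df = client.query(sql).to_dataframe()
print(df.head())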

Skills/experience:

  • Proven Power BI, BigQuery & SQL skills
  • Excellent communication, storytelling and presentation skills to effectively convey complex data insights to diverse audiences.
  • Ability to build strong relationships and collaborate effectively with stakeholders across departments.
  • Skilled at identifying and addressing data-related challenges to optimise business performance.

Any Google Analytics (GA4) experience is a bonus.

#powerbi #BigQuery #SQL #GA4 #buildreports #dashboards

Please get in touch for more details – [email protected]



Data Engineer - SC Cleared - OUTSIDE IR35

  • Posted April 5, 2024
  • £480 - £520 per day
  • London (hybrid up to 2 days a week)
  • Contract

Role: Data Engineer

Day Rate: £480 – £520 p/d (dependent on SFIA Level experience) – OUTSIDE IR35

Location: Central London (hybrid up to 2 days a week)

Clearance required: ACTIVE SC clearance

Duration: 6-month SoW – with scope for extension (3-year project)

Are you a Data Engineer with strong Kafka/streaming data experience, looking to work for a high-calibre Consultancy pioneering innovation in the Public Sector?

We’re on the lookout for SC-cleared Data Engineers who have experience in:

  • AWS – S3 for storage, Lambda functions, Athena (and therefore strong SQL); a minimal sketch follows this list
  • Python
  • The current front end is Tableau, but experience with any comparable BI tool is transferable
  • DevOps experience is a bonus, e.g. Terraform, Drone, and Kubernetes cluster management for microservice-style API data consumption
  • Consultative behaviour – strong stakeholder engagement skills
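
For context on the AWS stack listed above, here is a minimal, hypothetical sketch of a Lambda handler that starts an Athena query over data held in S3; the database, table and bucket names are placeholders and not the consultancy’s actual setup.

# Hypothetical sketch: Lambda handler that runs an Athena query over data in S3.
# Database, table and bucket names are placeholders.
import boto3

athena = boto3.client("athena")


def handler(event, context):
    # Start a SQL query against data catalogued for Athena.
    response = athena.start_query_execution(
        QueryString="SELECT event_type, COUNT(*) AS n FROM events GROUP BY event_type",
        QueryExecutionContext={"Database": "analytics"},
        ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},
    )
    # Results land in the S3 output location; return the id for polling.
    return {"query_execution_id": response["QueryExecutionId"]}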

If you hold active SC clearance, please do get in touch to find out more: [email protected]
