Data Specialist (Belfast)

Morgan McKinley - Lisburn, Antrim


Job Description

Markets Program Execution & Transformation – Data Acquisition Team
Belfast (Hybrid - Essential) | Approx. £415 p/d PAYE + Holiday (PAYE)

The Markets Program Execution & Transformation team partners with all Global Markets businesses and various functions—including Legal, Compliance, Finance, and Operations & Technology—to identify, mobilize, and deliver regulatory and cross-business transformation initiatives. The team's core mission is to design and implement integrated solutions that are efficient, scalable, and client-focused.

This role sits within the Data Acquisition Team, playing a critical part in supporting multiple projects and workstreams. The focus is on assessing and delivering robust data solutions and managing changes that impact diverse stakeholder groups in response to regulatory rulemaking, supervisory requirements, and discretionary transformation programs.

Key Responsibilities:
  • Develop PySpark and SQL queries to analyze, reconcile, and interrogate data.
  • Provide actionable recommendations to improve reporting processes, e.g. enhancing data quality, streamlining workflows, and optimizing query performance.
  • Contribute to architecture and design discussions in a Hadoop-based environment.
  • Translate high-level architecture and requirements into detailed design and code.
  • Lead and guide complex, high-impact projects across all stages of development and implementation while ensuring adherence to key processes.
  • Champion continuous improvement in areas such as code quality, testability, and system reliability.
  • Act as a subject matter expert (SME) for senior stakeholders and cross-functional teams.
  • Produce key project documentation, including Business Requirements Documents (BRDs), Functional Requirements Documents (FRDs), UAT plans, test scenarios, and project plans for technical deliverables.
  • Manage day-to-day project activities, including setting milestones, tracking tasks, coordinating deliverables, and ensuring timely, high-quality execution.

Required Skills & Experience:
  • Proficiency in SQL, Python, and Spark.
  • Minimum 5 years of hands-on technical data analysis experience.
  • Familiarity with Hadoop/Big Data environments.
  • Understanding of Data Warehouse/ETL design and development methodologies.
  • Ability to perform under pressure and adapt to changing priorities or requirements.
  • Strong communication skills—capable of producing detailed documentation and translating complex technical issues for non-technical audiences.
  • Self-motivated and able to work independently.

Preferred Qualifications:
  • Background in investment banking or financial services.
  • Hands-on experience with Hive, Impala, and the Spark ecosystem (e.g. HDFS, Apache Spark, Spark-SQL, UDFs, Sqoop).
  • Proven experience building and optimizing big data pipelines, architectures, and data sets.
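To illustrate the kind of SQL-based data reconciliation this role involves, here is a minimal, self-contained sketch using Python's built-in sqlite3 module. All table names, column names, and values are invented for the example; in practice this work would run as PySpark/Spark-SQL jobs against Hive or HDFS data sets on the Hadoop cluster:

```python
import sqlite3

# Toy reconciliation: find records present in a source feed but missing
# from, or mismatched against, a reporting store. Names are hypothetical.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE source_trades (trade_id TEXT, notional REAL)")
cur.execute("CREATE TABLE reported_trades (trade_id TEXT, notional REAL)")
cur.executemany("INSERT INTO source_trades VALUES (?, ?)",
                [("T1", 100.0), ("T2", 250.0), ("T3", 75.0)])
cur.executemany("INSERT INTO reported_trades VALUES (?, ?)",
                [("T1", 100.0), ("T2", 999.0)])  # T2 mismatched, T3 missing

# A LEFT JOIN surfaces the breaks: rows absent from the reporting store
# and rows whose notional values disagree between the two feeds.
breaks = cur.execute("""
    SELECT s.trade_id,
           s.notional AS source_notional,
           r.notional AS reported_notional
    FROM source_trades s
    LEFT JOIN reported_trades r ON s.trade_id = r.trade_id
    WHERE r.trade_id IS NULL OR s.notional <> r.notional
    ORDER BY s.trade_id
""").fetchall()
print(breaks)  # [('T2', 250.0, 999.0), ('T3', 75.0, None)]
```

The same join-and-compare pattern translates directly to Spark-SQL, where each side of the join would typically be a Hive table or a DataFrame loaded from HDFS.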

Created: 2025-08-01
