The Prontoux Way

Careers

Connect, grow, and bring your passion to accelerate your career at Prontoux
Current Opportunities
  • Job Description: Understanding of Oracle Install Base business transactions. Requirement gathering and analyzing impact across cross-functional teams. Work with multiple teams on solutions, with a focus on optimization and efficiency management.
    • Data analysis and use-case creation; prepare functional specification documents / user stories; work with Dev & QA on deliverables.
    • Exposure to Oracle Cloud a plus. Certified Scrum Master a plus.
    • Good SQL, reporting tools, and Cisco knowledge.
    • Strong ability to translate business requirements and needs into analytic solutions across multiple areas in IT and with various stakeholders, including key leaders and managers.
    • Leverage data to understand IT business processes in depth and identify opportunities for process improvement.
    • Develop a deep understanding of analytical data models; knowledge of data science and data-model design.
    • Support project development life cycles through data modeling, reporting, and analytics.
    • Participate in the ongoing development of the business intelligence and data warehousing functions within the wider organization.
    • Create training materials to guide business users on how to use dashboards.
    • Participate in the creation and support of development standards and best practices.
    • Explore and recommend emerging technologies and techniques to support and enhance BI landscape components; automate solutions where appropriate.
    • Write queries; analyze, visualize, and provide analytics on data to build reporting solutions that support various company initiatives, e.g., rich and dynamic dashboards in Tableau.
    Skills:
    • At least 4-6 years of business intelligence and data warehouse experience.
    • At least 2 years of experience with ANSI SQL / Presto / Hive / MySQL.
    • Scripting experience preferred (Python/R/JavaScript/PHP/Perl/Ruby/etc.).
    • Experience building and maintaining pipelines preferred.
    • Knowledge of ETL processes and designs.
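As a concrete illustration of the ANSI SQL reporting skill the posting above asks for, here is a minimal sketch using Python's built-in sqlite3 as a stand-in for Presto/Hive/MySQL; the table and column names are invented for the example.

```python
import sqlite3

# In-memory database standing in for a warehouse table (names are hypothetical).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [("EMEA", 120.0), ("EMEA", 80.0), ("APAC", 50.0)],
)

# A typical reporting aggregate: revenue per region, largest first.
rows = conn.execute(
    "SELECT region, SUM(amount) AS revenue "
    "FROM orders GROUP BY region ORDER BY revenue DESC"
).fetchall()
print(rows)  # [('EMEA', 200.0), ('APAC', 50.0)]
```

A query of this shape (aggregate, group, order) is also what typically feeds a Tableau data source behind a dashboard.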
  • Job Description:
    • Strong hands-on development experience (7 to 8 years).
    • At least 5-6 years of experience with Sales and Marketing modules.
    • Must have experience with the sales, marketing, and contact modules.
    • Admin/developer certifications.
    • Strong experience with Flows, UI, user security, and DevOps in SFDC.
    • Strong experience with microservices and API integration in/out of Salesforce, including publishing APIs.
    • Understands config vs. code and can guide teams on configuration.
    • Good to have: Cisco experience.
  • Job Description:
    • Strong hands-on development experience (7 to 8 years).
    • At least 5-6 years of experience with Sales and Marketing modules.
    • Must have experience with the Sales console and VDC console.
    • Admin/developer certifications.
    • Strong experience with Flows, UI, user security, and DevOps in SFDC.
    • Strong experience with microservices and API integration in/out of Salesforce, including publishing APIs.
    • Understands config vs. code and can guide teams on configuration.
    • Good to have: Cisco experience.
  • Job Description:
    • UI development is a must, preferably with Angular and React
    • MongoDB Atlas
    • Big Data Broker (BDB)
    • Python 3.8
    • JSONPath
    • RegEx
    • GitHub
    • Experience dealing with APIs: Intersight, CSOne, BDB, Borg, RMA, Sherlock, Webex, etc.
    • Ascension Orchestrator Platform (previously Action Orchestrator)
    • Production support experience is a plus
  • Job Description: We are looking for a highly capable machine learning engineer to optimize our machine learning systems. You will evaluate existing machine learning (ML) processes, perform statistical analysis to resolve data-set problems, and enhance the accuracy of our AI software's predictive automation capabilities. To succeed as a machine learning engineer, you should have data science knowledge and experience in a related ML role.
    Machine Learning Engineer Responsibilities:
    • Consulting with managers to determine and refine machine learning objectives.
    • Designing machine learning systems and self-running artificial intelligence (AI) software to automate predictive models.
    • Transforming data science prototypes and applying appropriate ML algorithms and tools.
    • Ensuring that algorithms generate accurate user recommendations.
    • Turning unstructured data into useful information, e.g., by auto-tagging images and text-to-speech conversion.
    • Solving complex problems with multi-layered data sets, and optimizing existing machine learning libraries and frameworks.
    • Developing ML algorithms that analyze huge volumes of historical data to make predictions.
    • Running tests, performing statistical analysis, and interpreting test results.
    • Documenting machine learning processes.
    • Keeping abreast of developments in machine learning.
    Machine Learning Engineer Requirements:
    • Bachelor's degree in computer science, data science, mathematics, or a related field.
    • Master's degree in computational linguistics, data analytics, or similar is advantageous.
    • Experience as a machine learning engineer.
    • Advanced proficiency writing Python, Java, and R code.
    • Extensive knowledge of ML frameworks, libraries, data structures, data modeling, and software architecture.
    • In-depth knowledge of mathematics, statistics, and algorithms.
    • Superb analytical and problem-solving abilities.
    • Great communication and collaboration skills.
    • Excellent time management and organizational abilities.
  • Job Description: Must have 10+ years of experience. Expert-level Python programming and debugging skills. Familiarity with REST APIs, cloud infrastructure, CI/CD, and Git. Familiarity with UI development and frameworks (React/Angular, Flask, PHP, JS, etc.).
    Required technical skills (must have):
    • Python
    • Flask (API framework)
    • aiohttp (Python-based async web framework)
    • OpenShift CAE (Red Hat-based Kubernetes platform)
    • Redis (for caching)
    • Snowflake (for storing data)
    Good to have:
    • RASA (NLP framework)
    • Docker (containers)
    • Elasticsearch
    • Grafana
    • Prometheus
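The "Redis (for caching)" item above boils down to a set-with-expiry / get-or-miss pattern. Here is a minimal pure-Python sketch of that pattern; the class and key names are invented, and a real deployment would call a Redis client such as redis-py rather than an in-process dict.

```python
import time

class TTLCache:
    """A tiny in-process stand-in for the Redis SET-with-expiry / GET pattern."""

    def __init__(self):
        self._store = {}  # key -> (value, expiry timestamp)

    def set(self, key, value, ttl_seconds):
        self._store[key] = (value, time.monotonic() + ttl_seconds)

    def get(self, key):
        entry = self._store.get(key)
        if entry is None:
            return None
        value, expires = entry
        if time.monotonic() >= expires:  # entry expired: evict and report a miss
            del self._store[key]
            return None
        return value

cache = TTLCache()
cache.set("user:42", {"name": "Ada"}, ttl_seconds=30)
print(cache.get("user:42"))  # {'name': 'Ada'}
```

The eviction-on-read design mirrors how expiring keys behave from a client's point of view: a stale key simply looks absent.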
  • Job Description: This role focuses on budget/portfolio management. It requires good program management skills, finance knowledge of how Cisco budgets are handled and maintained across multiple cross-functional teams (finance, Biz Ops managers), and coordination, along with good analytical/reporting skills for management reporting.
  • Job Description: A Java Developer is a programmer who designs, develops, and manages Java-based applications and software. Most large organizations use Java to implement software systems and backend services.
    Roles and Responsibilities: A developer is responsible for several Java-related duties throughout the software development lifecycle, from concept and design to testing, and is required to create user information solutions through the development, implementation, and maintenance of Java-based components and interfaces. Java developer roles vary greatly depending on the company and position. Typical roles and responsibilities include:
    • Contribute to all stages of the software development lifecycle
    • Design, implement, and maintain Java-based applications that can be high-volume and low-latency
    • Analyze user requirements to define business objectives
    • Envision system features and functionality
    • Define application objectives and functionality
    • Ensure application designs conform with business goals
    • Develop and test software
    • Identify and resolve any technical issues that arise
  • Job Description: Looking for a Project Manager with the following skills and experience:
    • Experience in Agile/Scrum methodologies and the Jira tool
    • Minimum 3-5 years of PM experience
    • Excellent communication, with experience in client-facing roles
    • Go-getter, open to challenges
    • Experience in data & analytics and databases
    • Understands the business side of projects/programs
    • Strong knowledge of IT solutions and release/program management
    • Passion for innovation and driving disruptive change
    • Ability to work across global teams and collaborate virtually
    • Critical thinking, judgment, coaching, and influencing skills that converge to create executive presence
    • Effective and efficient communication and presentation skills, with the ability to present information at the executive level
    • Business process analysis and design
    • Understanding of change management fundamentals
    • Ability to multitask in an environment with a high degree of change and ambiguity
    • Scrum Master certified
  • Job Description:
    • 4+ years of experience with Azure DevOps and Agile/Kanban practices
    • Experience in DevOps for data platform tools and technologies such as Azure ADF, Python, and Snowflake
    • Collaborate with product owners, developers, cloud engineers, other DevOps engineers, and operations to plan, design, test, and deliver pipelines and infrastructure using the Continuous Integration/Continuous Delivery (CI/CD) model
    • Responsible for enhancing and maintaining highly available CI/CD pipelines and development infrastructure
    • Develop self-service solutions to support the delivery of software with great speed, security, reliability, and quality
    • Develop and maintain system deployment automation processes that enable teams to deploy, manage, configure, scale, and monitor their applications
    • Drive DevOps automation and containerization strategies that align with DevOps principles and standards
    • Perform analysis of DevOps practices; identify gaps and impediments to continuous integration and delivery
    • Develop documentation of CI/CD pipelines and communicate pipeline changes to dev teams
  • Job Description: 4+ years of experience enabling data quality automation, scoring models, comparator testing, dashboard enablement, and data quality reporting (any tool is fine; Ataccama is preferred). Support data profiling and metadata creation. Partner with data stewards and product managers to help improve the DQ score.
  • Job Description:
    • Set up a brand-new visualization platform.
    • Participate in visualization platform technology selection; responsible for designing, deploying, and maintaining the platform.
    • Migrate ~100+ existing Power BI dashboards to the new platform.
    • Draft platform governance and report/dashboard publishing procedures.
    • Solid experience in Power BI content development and SQL.
    • Understanding of the infrastructure requirements for setting up Power BI Desktop, the Power BI service, and Power BI Embedded.
    • Hands-on workspace, report, and dashboard design in Power BI Desktop and the Power BI service.
    • Report creation using visualizations in Power BI to support customers.
    • Data modeling, calculations, conversions, and scheduling data refreshes in Power BI Desktop and the Power BI service.
    • Hands-on with DAX expressions and the ability to perform complex calculations in Power BI.
    • Experience working with SQL databases.
    • Must have worked with connectors to various data sources.
    • Expert knowledge of Microsoft Power BI Desktop and the Power BI service.
    • Expert knowledge of Microsoft Power BI visuals, data modeling, and transformation techniques.
    • Excellent analytical skills, including the ability to perform research involving interpretation and analysis from a variety of sources, including sources and/or data that may need to be developed.
  • Job Description:
    Responsibilities:
    • Create the high-level architecture and data solution design for projects (conceptual model, integration model, sourcing) in alignment with the portfolio and enterprise data strategy.
    • Translate enterprise or business needs into long-term information architecture solutions.
    • Apply the enterprise vision to all portfolio, technical, and application projects, understanding and communicating different architectural models/strategies consistent with business strategies.
    • Analyze business processes, operational applications, and source data to understand dependencies, anomalies, and implicit business rules that impact the ability to manage the master data.
    • Design, develop, and maintain logical and physical data models.
    • Perform data profiling and analysis, and develop data models that correspond to the business and technical requirements.
    • Review and sign off on all project data models.
    • Provide governance oversight to ensure project adherence to information architecture strategy, principles, standards, policies, and procedures throughout all project phases.
    • Partner with other architects to ensure alignment and integration across portfolio boundaries and promote an enterprise focus on data management.
    • Review information architecture deliverables throughout the development process to ensure quality, traceability to requirements, and adherence to all plans and standards.
    Required Qualifications:
    • Bachelor's degree or equivalent in Engineering, Computer Science, Mathematics, or a related technical field; or related work experience.
    • 5-7 years of relevant experience required.
    • Develop and maintain the portfolio information architecture strategy, including data models and data strategy (e.g., management, integration, and optimization), and apply it to ongoing projects and initiatives, acting as the subject-matter expert.
    • Applied experience and expertise in data architecture concepts (e.g., model management, metadata management, data governance).
    • Experience delivering concurrent, large information management projects at all phases.
    • Work with project managers and across the architecture organization to develop the approach and cost definition through design and delivery.
    • Demonstrated proficiency in requirements-gathering and analysis processes and methodologies.
    • Experience developing a data strategy and working within and influencing the associated policies.
    • 3+ years of software development and/or technical support experience.
    Preferred Qualifications:
    • Effective communication skills (written and verbal) to interact with all levels of technology and business partners.
    • Financial services industry knowledge, including working knowledge of industry data standards and architectures.
  • Job Description: Sr. GCP Developer, 10-12 years. As a Sr. GCP Developer, you will be responsible for designing the solution on GCP. You will contribute to pre-sales and design, implement scalable information solutions covering data security, data privacy, data governance, metadata management, multi-tenancy, and mixed workload management, and provide delivery oversight.
    Here's how you'll contribute: The Sr. GCP Data Developer at Zensar participates in the end-to-end cycle from opportunity identification to closure, takes complete ownership of project execution, and provides valuable expertise on the project. You will do this by:
    • Understanding customer requirements and creating technical propositions
    • Managing and owning all aspects of technical development and delivery
    • Understanding requirements and writing technical documents
    • Ensuring code review and developing best practices
    • Planning the end-to-end technical scope of the project and customer engagement, including sprint and delivery planning
    • Estimating effort, identifying risk, and providing technical support whenever needed
    • Demonstrating the ability to multitask and re-prioritize responsibilities based on dynamic requirements
    • Mentoring teams as needed
    Skills required to contribute: 10-12+ years of overall development experience, with:
    1. 6+ years of experience with Google Cloud Platform (GCP) products, including BigQuery, Cloud Storage, Cloud Functions, Dataproc, and Data Studio.
    2. Must have Google Cloud BigQuery experience, including datasets, objects, IAM roles/bindings, Logs Explorer, and troubleshooting issues.
    3. Understanding of CI/CD pipelines and Terraform scripting for deploying objects and IAM bindings.
    4. Knowledge of data modeling (Erwin) and governance, object review, and best practices.
    5. Good knowledge of data warehouse concepts and ETL pipelines, including Informatica/Talend, IICS, and any RDBMS (Teradata nice to have).
    6. Excellent communication and presentation skills.
    7. Extensive experience with the Google Cloud stack: Google Cloud Storage, Google BigQuery, Google Dataflow, Google Dataproc, Google Data Studio, etc.
    8. Experience in job scheduling using Oozie, Airflow, or any other ETL scheduler.
    9. Design and build production data pipelines from ingestion to consumption within a big data architecture, using Java, Python, or Scala.
    10. Good experience in designing and delivering data analytics solutions using GCP cloud-native services.
    11. Good experience in requirements analysis and solution architecture design, data modeling, ETL, data integration, and data migration design.
    12. Well versed in Waterfall, Agile, Scrum, and similar project delivery methodologies.
    13. Experienced in internal as well as external stakeholder management.
    14. Professional Google Cloud Data Engineer certification is an added advantage.
    15. Nice-to-have skills: working experience with Snowflake, Databricks, and open-source stacks such as Hadoop big data, Hive, etc.
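Item 9 above ("design and build production data pipelines from ingestion to consumption") is, at its core, an extract-transform-load loop. A toy sketch in plain Python, with all data, file, and field names invented; sqlite3 stands in here for a warehouse such as BigQuery.

```python
import csv
import io
import sqlite3

# Extract: hypothetical raw CSV as it might arrive in object storage.
raw = "sku,qty,price\nA1,2,9.50\nB2,1,20.00\nA1,3,9.50\n"
records = list(csv.DictReader(io.StringIO(raw)))

# Transform: normalize types and compute per-line revenue.
for r in records:
    r["revenue"] = int(r["qty"]) * float(r["price"])

# Load: write to a warehouse table (sqlite3 standing in for BigQuery).
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE sales (sku TEXT, revenue REAL)")
db.executemany("INSERT INTO sales VALUES (?, ?)",
               [(r["sku"], r["revenue"]) for r in records])

# Consumption: an aggregate a downstream dashboard would read.
total = db.execute("SELECT SUM(revenue) FROM sales").fetchone()[0]
print(total)  # 19.0 + 20.0 + 28.5 = 67.5
```

In a scheduled pipeline, each of these three stages would typically be a separate task wired together by an orchestrator such as Airflow or Oozie, as item 8 describes.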
  • Job Description: GCP Developer, 8 years. As a GCP Developer, you will be responsible for designing the solution on GCP. You will contribute to pre-sales and design, implement scalable information solutions covering data security, data privacy, data governance, metadata management, multi-tenancy, and mixed workload management, and provide delivery oversight.
    Here's how you'll contribute: The GCP Data Developer at Zensar participates in the end-to-end cycle from opportunity identification to closure, takes complete ownership of project execution, and provides valuable expertise on the project. You will do this by:
    • Understanding customer requirements and creating technical propositions
    • Managing and owning all aspects of technical development and delivery
    • Understanding requirements and writing technical documents
    • Ensuring code review and developing best practices
    • Planning the end-to-end technical scope of the project and customer engagement, including sprint and delivery planning
    • Estimating effort, identifying risk, and providing technical support whenever needed
    • Demonstrating the ability to multitask and re-prioritize responsibilities based on dynamic requirements
    • Mentoring teams as needed
    Skills required to contribute: 8+ years of overall development experience, with:
    1. 5+ years of experience with Google Cloud Platform (GCP) products, including BigQuery, Cloud Storage, Cloud Functions, Dataproc, and Data Studio.
    2. Must have Google Cloud BigQuery experience, including datasets, objects, IAM roles/bindings, Logs Explorer, and troubleshooting issues.
    3. Understanding of CI/CD pipelines and Terraform scripting for deploying objects and IAM bindings.
    4. Knowledge of data modeling (Erwin) and governance, object review, and best practices.
    5. Good knowledge of data warehouse concepts and ETL pipelines, including Informatica/Talend, IICS, and any RDBMS (Teradata nice to have).
    6. Excellent communication and presentation skills.
    7. Extensive experience with the Google Cloud stack: Google Cloud Storage, Google BigQuery, Google Dataflow, Google Dataproc, Google Data Studio, etc.
    8. Experience in job scheduling using Oozie, Airflow, or any other ETL scheduler.
    9. Design and build production data pipelines from ingestion to consumption within a big data architecture, using Java, Python, or Scala.
    10. Good experience in designing and delivering data analytics solutions using GCP cloud-native services.
    11. Good experience in requirements analysis and solution architecture design, data modeling, ETL, data integration, and data migration design.
    12. Well versed in Waterfall, Agile, Scrum, and similar project delivery methodologies.
    13. Experienced in internal as well as external stakeholder management.
    14. Professional Google Cloud Data Engineer certification is an added advantage.
    15. Nice-to-have skills: working experience with Snowflake, Databricks, and open-source stacks such as Hadoop big data, Hive, etc.
  • Job Description: GCP Lead, 12-15 years. As GCP Lead, you will be responsible for supervising D2P activity (development, code review, test phases, implementation plan) and the solution on GCP. You will contribute to pre-sales and design, implement scalable information solutions covering data security, data privacy, data governance, metadata management, multi-tenancy, and mixed workload management, and provide delivery oversight.
    Here's how you'll contribute: The GCP Lead at Zensar participates in the end-to-end cycle from opportunity identification to closure, takes complete ownership of project execution, and provides valuable expertise on the project. You will do this by:
    • Understanding customer requirements and creating technical propositions
    • Managing and owning all aspects of technical development and delivery
    • Contributing to SoWs, technical project roadmaps, etc., required for successful project execution, leveraging a technical scoping & solutioning approach
    • Providing technical leadership and being a role model/coach to software engineers pursuing a technical career path in engineering
    • Ensuring code review and developing best practices
    • Planning the end-to-end technical scope of the project and customer engagement, including sprint and delivery planning
    • Estimating effort, identifying risk, and providing technical support whenever needed
    • Demonstrating the ability to multitask and re-prioritize responsibilities based on dynamic requirements
    • Providing regular, timely updates and guidance to the leadership team regarding status, risks, etc.
    • Leading and mentoring teams as needed
    • Understanding technical requirements and taking part in technical discussions
    • Proposing and planning the technical solution accordingly (design, development, and implementation)
    • Defining and dividing tasks based on the requirements
    • Taking part in and hosting regular knowledge-sharing sessions, mentoring more junior members of the team, and supporting the continuous development of our practice
    Skills required to contribute: 12-15+ years of overall IT experience, with:
    1. 6+ years of experience with Google Cloud Platform (GCP) products, including BigQuery, Cloud Storage, Cloud Functions, Dataproc, and Data Studio.
    2. Must have Google Cloud BigQuery experience, including datasets, objects, IAM roles/bindings, Logs Explorer, and troubleshooting issues.
    3. Understanding of CI/CD pipelines and Terraform scripting for deploying objects and IAM bindings.
    4. Knowledge of data modeling (Erwin) and governance, object review, and best practices.
    5. Good knowledge of data warehouse concepts and ETL pipelines, including Informatica/Talend, IICS, and any RDBMS (Teradata nice to have).
    6. Excellent communication and presentation skills.
    7. Extensive experience with the Google Cloud stack: Google Cloud Storage, Google BigQuery, Google Dataflow, Google Dataproc, Google Data Studio, etc.
    8. Experience in job scheduling using Oozie, Airflow, or any other ETL scheduler.
    9. Design and build production data pipelines from ingestion to consumption within a big data architecture, using Java, Python, or Scala.
    10. Good experience in designing and delivering data analytics solutions using GCP cloud-native services.
    11. Good experience in requirements analysis and solution architecture design, data modeling, ETL, data integration, and data migration design.
    12. Well versed in Waterfall, Agile, Scrum, and similar project delivery methodologies.
    13. Experienced in internal as well as external stakeholder management.
    14. Professional Google Cloud Data Engineer certification is an added advantage.
    15. Nice-to-have skills: working experience with Snowflake, Databricks, and open-source stacks such as Hadoop big data, Hive, etc.
  • Job Description: GCP Architect, 12-15 years. As a GCP Data Architect, you will be responsible for providing advisory and thought leadership on migration to Google Data Cloud or implementing data solutions using Google Data Cloud services, including integration with existing data and analytics platforms and tools. You will contribute to pre-sales and design, implement scalable data architectures and information solutions covering data security, data privacy, data governance, metadata management, multi-tenancy, and mixed workload management, and provide delivery oversight.
    Here's how you'll contribute: The GCP Data Architect at Zensar participates in the end-to-end cycle from opportunity identification to closure, takes complete ownership of project execution, and provides valuable expertise on the project. You will do this by:
    • Responding to client RFI/RFP documents with a proper solution design, including cost estimates
    • Understanding customer requirements and creating technical propositions
    • Contributing to SoWs, technical project roadmaps, etc., required for successful project execution, leveraging a technical scoping & solutioning approach
    • Managing and owning all aspects of technical development and delivery
    • Understanding requirements and writing technical documents
    • Ensuring code review and developing best practices
    • Planning the end-to-end technical scope of the project and customer engagement, including sprint and delivery planning
    • Estimating effort, identifying risk, and providing technical support whenever needed
    • Demonstrating the ability to multitask and re-prioritize responsibilities based on dynamic requirements
    • Leading and mentoring teams as needed
    Skills required to contribute: 12-15 years of data and analytics experience, with a minimum of 5 years in Google Data Cloud native services.
    1. Excellent communication and presentation skills.
    2. Extensive experience with the Google Cloud stack: Google Cloud Storage, Google BigQuery, Google Dataflow, Google Dataproc, Google Data Studio, etc.
    3. Experience in job scheduling using Oozie, Airflow, or any other ETL scheduler.
    4. Analyze, re-architect, and re-platform on-premise data warehouses to data platforms on GCP using native or third-party services.
    5. Design and build production data pipelines from ingestion to consumption within a big data architecture, using Java, Python, or Scala.
    6. Good experience in designing and delivering data analytics solutions using GCP cloud-native services.
    7. Good experience in requirements analysis and solution architecture design, data modeling, ETL, data integration, and data migration design.
    8. Well versed in Waterfall, Agile, Scrum, and similar project delivery methodologies.
    9. Experienced in internal as well as external stakeholder management.
    10. Experience with MDM/DQM/data governance technologies such as Collibra, Ataccama, Alation, and Reltio is an added advantage.
    11. Professional Google Cloud Data Engineer certification is an added advantage.
    12. Nice-to-have skills: working experience with Snowflake, Databricks, and open-source stacks such as Hadoop big data, Hive, etc.
  • Job Description: Title: AWS/Python Developer
  • Job Description: Support Analyst with AEM & Data-OT experience.
    • 5+ years of experience in content management with AEM
    • AEM functional knowledge
    • XML & Data-OT experience
    • AEM WEM support experience
    • Content authoring & publishing experience
    • Agile & Scrum team experience
    Domain experience: content management systems. Tools: AEM (functional knowledge).
  • Job Description: Google BigQuery, SQL, Java
  • Job Description: Linux, Bash, Docker, Jenkins. Can deploy and run/troubleshoot running systems using Google Cloud Operations, Cloud Run, and GKE. Can author build and deployment pipelines targeting Google Cloud.
  • Job Description: Java, Maven, Google Pub/Sub, JDBC, JMS, Spring Boot or similar
  • Job Description: Must have experience with Azure Data Factory. Experience implementing a data platform using Python, Azure Data Lake, and Snowflake.
  • Job Description:
    • Establish a documented methodology, including processes, templates, tools, and best practices, for SFDC solution and service offerings
    • Provide strong analytical, problem-solving, technical, and project management skills to effectively collect requirements, evaluate options, and design and deliver quality solutions
    • Contribute new products, workflows, and ideas to influence the SDLC platform
    • Identify risks that threaten project success and recommend workarounds
    • Ensure Salesforce runs smoothly by testing and adjusting as needed, then documenting fixes
    • Develop and test technology solutions geared toward improving tasks as they are completed today within the organization
    • Create various sub-methodologies within the overall delivery framework to accommodate Waterfall, Agile, and hybrid approaches to project implementations
    • Develop a reusable repository of tools, work objects, best practices, project delivery standards, thought leadership, and frameworks for efficient execution of engagements that drive business value
    • Collaborate with product managers from other teams to develop and execute the CRM solution strategy
    • Drive technical and operational excellence in all project aspects, such as requirements gathering, gap analysis, solution design, implementation, and release management
    • Provide consistent project updates to various stakeholders about strategy, adjustments, and progress
    • Manage contracts with vendors and suppliers by assigning tasks and communicating expected deliverables
    • Ability to convey complex outcomes through concise, data-driven tools
    Skills required to contribute:
    1. Bachelor's degree in computer science, business, or a related field
    2. Overall 8+ years of IT analyst experience, including 5+ years as a Salesforce IT Analyst
    3. Exceptional knowledge of various Salesforce modules and CRM business processes
    4. 3+ years of experience in product management and various SDLC methodologies
    5. Ability to define and document business processes and workflows
    6. Experience seeing projects through the full life cycle
    7. Strong interpersonal skills; extremely resourceful
    8. Knowledge of the Salesforce.com and Force.com platforms
  • Job Description:
    • 10+ years of experience leading the design and development of data and analytics projects in a global company.
    • Experience working on projects across cross-functional teams, building sustainable processes, and coordinating release schedules.
    • 8+ years of MS SQL experience with data modeling.
    • 2+ years of experience working with Microsoft Azure, with strong knowledge of ADLS, Blob Storage, Data Factory, SQL Server, and warehouses.
    • Experience with cloud-based EDW platforms (Snowflake, Synapse, Redshift, BigQuery, etc.).
    • 5+ years of building and launching new data models that provide intuitive analytics for analysts and customers.
    • Excellent communication skills.
  • Job Description: Here's how you'll contribute:
    • Be a technical expert on all aspects of Snowflake
    • Deploy Snowflake following best practices
    • Develop complex data models in Snowflake
    • Tune Snowflake for performance and optimize utilization
    • Monitor the health, tuning, and growth of on-premise and cloud databases
    • Ensure the security and maintenance of databases
    • Manage and monitor user access to the databases
    • Diagnose and troubleshoot database errors
    • Maintain schemas and objects
    • Work hands-on with customers to demonstrate and communicate implementation best practices on Snowflake technology
    • Maintain a deep understanding of complementary technologies and help organizations leverage Snowflake as part of their larger technology stack
    • Provide guidance on how to resolve customer-specific technical challenges
    • Build, design, architect, and implement high-volume, high-scale data analytics and machine learning Snowflake solutions in the cloud
    • Provide technology leadership in driving large, complex Snowflake implementations
    • Design and develop features, understand customer requirements, and meet business goals
    • Implement, monitor, and maintain complex Snowflake solutions
    • Experience migrating data from on-prem to the cloud, with a minimum of 1 year of full product lifecycle experience
    • Worked on a Snowflake data warehouse migration project
    • Should know how data gets processed and consumed in Snowflake and how to automate the complete process
    Required Skills:
    • Bachelor's degree in computer science or a related field
    • Minimum 7 years of database administration experience required
    • Experience with SQL replication and SQL Server Agent
    • Experience with MS-SQL software installation, upgrades, and patching
    • Experience with data warehouses and data marts
    • Snowflake certification is good to have
    • Prior experience managing and setting up medium to large Snowflake installations
    • Experience with Lean/Agile development methodologies
    • Demonstrated proficiency in index design and query plan optimization and analysis required
    • Demonstrated knowledge of MS Windows environments
    • Willingness to learn new skills and acquire new knowledge
    • Experience managing large, high-transaction databases
    • Ability to work well both independently and with other team members
    • Ability to multitask and provide rapid support in production
  • Job Description:
    Responsibilities:
    • Create the high-level architecture and data solution design for projects (conceptual model, integration model, sourcing) in alignment with the portfolio and enterprise data strategy.
    • Translate enterprise or business needs into long-term information architecture solutions.
    • Apply enterprise vision to all portfolio, technical, and application projects, understanding and communicating different architectural models and strategies consistent with business strategies.
    • Analyze business processes, operational applications, and source data to understand dependencies, anomalies, and implicit business rules that affect the ability to manage master data.
    • Design, develop, and maintain logical and physical data models.
    • Perform data profiling and analysis, and develop data models that correspond to the business and technical requirements.
    • Review and sign off on all project data models.
    • Provide governance oversight to ensure project adherence to information architecture strategy, principles, standards, policies, and procedures throughout all project phases.
    • Partner with other architects to ensure alignment and integration across portfolio boundaries and promote an enterprise focus on data management.
    • Review information architecture deliverables throughout the development process to ensure quality, traceability to requirements, and adherence to all plans and standards.
    Required Qualifications:
    • Bachelor's degree or equivalent in Engineering, Computer Science, Mathematics, or a related technical field; or related work experience.
    • 5-7 years of relevant experience.
    • Develop and maintain the portfolio information architecture strategy, including data models and data strategy (e.g., management, integration, and optimization), and apply it to ongoing projects and initiatives, acting as the subject matter expert.
    • Applied experience and expertise in data architecture concepts (e.g., model management, metadata management, data governance).
    • Experience delivering concurrent, large information management projects through all phases, from project definition through design and delivery.
    • Work with project managers and across the architecture organization to develop the approach and costs.
    • Demonstrated proficiency in requirements gathering and analysis processes and methodologies.
    • Experience developing a data strategy and working within and influencing the associated policies.
    • 3+ years of software development and/or technical support experience.
    Preferred Qualifications:
    • Effective communication skills (written and verbal) to interact with all levels of technology and business partners.
    • Financial services industry knowledge, including working knowledge of industry data standards and architectures.
  • Job Description:
    • Proficiency in SQL and Python.
    • Understanding of or familiarity with Snowflake or GCP.
    • Run data pipelines in Snowflake with Python data integration.
    • Design and implement ETL data pipelines integrating various data sources in Python.
    • Create Python modules, SQL scripts, indexes, and complex queries for data analysis, extraction from data sources, and automation of manual workloads in the ETL pipeline.
    • Good to have: knowledge of Salesforce Marketing Cloud.
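The posting above asks for Python ETL pipelines that integrate data sources and load them into a warehouse such as Snowflake. A minimal sketch of that extract/transform/load pattern, using the standard library's sqlite3 as a stand-in for the warehouse connection; the feed, table, and column names here are illustrative, not from the posting:

```python
import csv
import io
import sqlite3

# Illustrative source data standing in for an external feed; in a real
# pipeline this would come from an API, a file drop, or another database.
RAW_CSV = """order_id,amount,region
1,19.99,west
2,5.00,east
3,42.50,west
"""

def extract(text: str) -> list[dict]:
    """Read raw CSV rows into dictionaries keyed by column name."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows: list[dict]) -> list[tuple]:
    """Cast types and normalize the region field to upper case."""
    return [(int(r["order_id"]), float(r["amount"]), r["region"].upper())
            for r in rows]

def load(conn: sqlite3.Connection, rows: list[tuple]) -> None:
    """Create the target table if needed and bulk-insert the rows."""
    conn.execute("CREATE TABLE IF NOT EXISTS orders "
                 "(order_id INTEGER PRIMARY KEY, amount REAL, region TEXT)")
    conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", rows)
    conn.commit()

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")  # stand-in for the warehouse connection
    load(conn, transform(extract(RAW_CSV)))
    total = conn.execute(
        "SELECT SUM(amount) FROM orders WHERE region = 'WEST'").fetchone()[0]
    print(round(total, 2))  # 62.49
```

Against Snowflake itself, only the connection object and SQL dialect would change; the extract/transform/load split stays the same.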
  • Job Description:
    Requires 10 years of experience.
    • Power BI / Tableau report development.
    • Building Analysis Services reporting models.
    • Developing visual reports, KPI scorecards, and dashboards using Power BI Desktop.
    • Connecting data sources, importing data, and transforming data for business intelligence.
    • Analytical thinking for translating data into informative reports and visuals.
    • Capable of implementing row-level security on data, along with an understanding of application security layer models in Power BI.
    • Strong command of DAX queries in Power BI Desktop.
    • Expert in advanced-level calculations on data sets.
    • Responsible for design methodology and project documentation.
    • Able to develop tabular and multidimensional models compatible with data warehouse standards.
    • Very good communication skills; able to discuss requirements effectively with client teams and with internal teams.
    Skills & Requirements:
    • Proven working experience as a data analyst or business data analyst, with good knowledge of Power BI.
    • Technical expertise in data models, database design and development, data mining, and segmentation techniques.
    • Strong knowledge of and experience with reporting platforms (Snowflake, Hadoop) and ETL frameworks.
    • Strong analytical skills, with the ability to collect, organize, analyze, and disseminate significant amounts of information with attention to detail and accuracy.
    • Adept at queries, report writing, and presenting findings.
  • Job Description:
    • 5+ years of experience in software development.
    • Technical experience in Python is required.
    • Excellent verbal communication and documentation skills.
    • Ability to provide recommendations for the end-to-end architecture and infrastructure.
    • An understanding of application performance tuning and resource usage.
    • Ability to troubleshoot existing scripts and deliver point bug fixes or application redesigns.
    • Working knowledge of MongoDB, JSON, and sync/async architectures.
    • You collaborate with other engineers and people across the organization to improve our solution delivery.
    • You believe in Lean/Agile principles and actively work to incorporate them in the organization.
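The posting above asks for working knowledge of JSON and sync/async architecture in Python. A minimal asyncio sketch using only the standard library; the fetch_record coroutine and its sleep are illustrative stand-ins for real async I/O, such as a MongoDB query through an async driver:

```python
import asyncio
import json

async def fetch_record(record_id: int) -> dict:
    """Simulate an async I/O call (e.g., a database lookup) returning a
    JSON-shaped document. The sleep stands in for network latency."""
    await asyncio.sleep(0.01)
    return {"id": record_id, "status": "ok"}

async def fetch_all(ids: list[int]) -> list[dict]:
    """Issue all lookups concurrently rather than one after another,
    which is the main payoff of the async style for I/O-bound work."""
    return list(await asyncio.gather(*(fetch_record(i) for i in ids)))

if __name__ == "__main__":
    records = asyncio.run(fetch_all([1, 2, 3]))
    print(json.dumps(records))
```

The concurrent version completes in roughly one round-trip time instead of three; a synchronous loop over the same lookups would serialize them.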
  • Job Description:
    Responsibilities:
    • Create an overall internal enterprise systems strategy based on business goals and direction, in alignment with the overarching IT strategy, with a supporting roadmap that accounts for technology trends and regulatory requirements. Articulate alternative options and solicit approval from key business and finance stakeholders, including governance boards and the Executive Committee.
    • Create and regularly monitor measures within areas of accountability (for the enterprise applications team).
    • Develop an architectural roadmap and reference architecture for all technologies across the enterprise, as well as roadmaps for rationalization and achievement of architectural objectives for all software systems across the enterprise.
    • Develop toolsets, software development operations, and automation supporting the DevOps needs of the enterprise.
    • Work with senior leadership peers to drive a coordinated program of architectural exploration and innovation, constantly monitoring developments in the market and seeking out disruptive technologies and vendors.
    • Define and enforce architecture governance in all software systems and infrastructure programs.
    • Provide cost-effective application and infrastructure solutions and analysis in support of the development and implementation of business applications (evaluation, analysis, requirements definition, development, implementation, and training).
    • Develop and maintain a private/hybrid cloud application architecture solution based on situational awareness of various business scenarios and motivations.
    • Participate in demand management and the development lifecycle, including proof-of-concept development, supporting operationalization of concepts, solution definition and sizing, and overall implementation of the architectural roadmap.
    • Identify risks to goals and objectives and create mitigation plans in collaboration with stakeholders.
    • Communicate relevant and timely information to senior stakeholders for purposes of governance and risk mitigation.
    • Oversee financial budgets and forecasts, remaining within commitments while achieving goals; understand the progression of the budget versus commitments.
    • Review and adjust budgets developed or recommended by direct-report departments, and authorize expenditures.
    • Improve productivity at the best possible cost.
    • Understand business processes associated with enterprise applications and product lifecycle management, recommending solutions to problems and enabling achievement of business goals.
    • Implement and ensure the best leverage of new and existing enterprise application technology across diverse business areas and processes.
    • Provide resources and subject matter expertise for reliable project delivery (on time, on budget, on quality), in cooperation with the PMO, leveraging the tools, methodologies, and resources the PMO provides.
    • Communicate dependencies and the resources required in other areas to successfully execute programs (e.g., business, IT, data & analytics).
    • Maintain reliable core applications and related interfaces.
    • Work with peers to coordinate and communicate activities and mitigate risks in alignment with overall corporate and IT strategic intent.
    Qualifications:
    Education: Bachelor's degree (B.A./B.S.) in computer science, engineering, business administration, or a related discipline.
    Experience & Skills:
    • Minimum of twelve (12) years of application delivery and maintenance experience, preferably in the medical device, life sciences, or a related industry.
    • Effective communication skills for large, multinational audiences.
    • Experience with solution architecture and with application and data architecture.
    • Understanding of enterprise architecture frameworks.
    • Consulting experience preferred.
    • Corporate applications experience (Salesforce, Oracle EBS, ServiceNow) preferred.
    • Experience with vendor evaluation and with on-prem and SaaS licensing models.
    • RPA, chatbot/AI, Splunk, Aspera, and VDI knowledge and experience preferred.
    • Management acumen: a solid understanding of, and demonstrated ability in, managing and working with a high-performing team and leading business partners and customers to drive desired outcomes.
    • Business acumen: experience managing budgets, vendor management, and an understanding of business processes and customer value.
    • Operational performance: understanding of and/or experience with operational excellence theories and methodologies.
    • Experience with and/or understanding of Agile and/or project management methodologies.
  • Job Description:
    Must-Have:
    • Written and verbal communication skills.
    • Capacity to manage high-stress situations.
    • Ability to multi-task and manage various project elements simultaneously.
    • Ability to lead the team toward the big picture and vision.
    • Attention to detail and conflict resolution skills.
    JD:
    • Ability to oversee and manage large, complex, diverse, and strategic projects that impact the organization as a whole.
    • Coordinate and communicate with all areas of the enterprise that impact the scope, budget, risk, and resources of work efforts.
    • Ability to assemble project plans and team work assignments, direct and monitor work efforts daily, identify resource needs, perform quality reviews, and escalate functional, quality, and timeline issues appropriately.
    • Project Management certification or successful completion of a recognized project management curriculum is preferred.
  • Job Description:
    • Provide software development and project management, cross-functional coordination, and inter/intra-team communication to deliver outstanding program and project outcomes.
    • Work closely with software engineers, quality analysts, product managers, and other engineering teams to deliver high-quality products and features through agile practices.
    • Manage project schedules, identify possible issues, and clearly communicate them to project stakeholders.
    • Take responsibility for release schedules and milestones, maintaining a high velocity in a fast-paced environment.
    • Manage scrum calls and agile ceremonies.
  • Job Description:
    • 8+ years of experience in Java/J2EE development.
    • Experience/skills: Core Java, microservices, Spring, Spring Boot, web services.
    • Good to have: NoSQL databases, Elasticsearch/Kafka exposure.
  • Job Description:
    Basic Qualifications:
    • Bachelor's or master's degree.
    • Minimum of 5 years of consulting experience with client impact and value creation.
    • Minimum of 5 years of experience with BI data platforms and technologies and master data management solutions, including Teradata, SAP HANA, Oracle, Snowflake, SQL, PL/SQL, and Informatica.
    • Minimum of 3 years of experience with systems and data architecture modeling tools and techniques, using standard architecture frameworks.
    Responsibilities:
    • Evaluating business processes, anticipating requirements, uncovering areas for improvement, and developing and implementing solutions.
    • Leading ongoing reviews of business processes and developing optimization strategies.
    • Staying up to date on the latest process and IT advancements to automate and modernize systems.
    • Conducting meetings and presentations to share ideas and findings.
    • Performing requirements analysis.
    • Documenting and communicating the results of your efforts.
    • Effectively communicating your insights and plans to cross-functional team members and management.
    • Gathering critical information from meetings with various stakeholders and producing useful reports.
    • Working closely with clients, technicians, and managerial staff.
    • Providing leadership, training, coaching, and guidance to junior staff.
    • Allocating resources and maintaining cost efficiency.
    • Ensuring solutions meet business needs and requirements.
    • Performing user acceptance testing.
    • Managing projects, developing project plans, and monitoring performance.
    • Updating, implementing, and maintaining procedures.
    • Prioritizing initiatives based on business needs and requirements.
    • Serving as a liaison between stakeholders and users.
    • Managing competing resources and priorities.
    • Monitoring deliverables and ensuring timely completion of projects.
  • Job Description:
    • Must have Cisco data experience, such as IB data, service contract management, auto-renewals, services domain, bookings, quotes, and Digital Case Services.
    • CCW, Commerce Quote-to-Order/Q2C processes, and CCW-R knowledge are a must.
    • Project management and stakeholder management are a must.
  • Job Description:
    • Work with the ML engineering team to deploy and monitor models.
    • Good working knowledge of NLP/NLU paradigms, models, and approaches.
    • Help design data pipelines for model development and inferencing.
    • Create and maintain a Confluence page on ML lifecycle management practices by working with data scientists and the ML architect.
    Skills:
    • Deep experience in Spark and Python.
    • Experience with AWS compute and ML toolkit: SageMaker, Glue, Brew, Redshift.
    • Experience in conversational analytics and configuring NLP tools.
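The posting above mentions designing data pipelines for model development and inferencing. A minimal batch-inference sketch in pure Python: the predict function is a stub standing in for a call to a deployed model (e.g., a hosted endpoint), and all record and field names are illustrative:

```python
from typing import Iterable, Iterator

def batches(items: Iterable[dict], size: int) -> Iterator[list[dict]]:
    """Group records into fixed-size batches, since inference endpoints
    are usually far cheaper to call per batch than per record."""
    batch: list[dict] = []
    for item in items:
        batch.append(item)
        if len(batch) == size:
            yield batch
            batch = []
    if batch:
        yield batch

def predict(batch: list[dict]) -> list[str]:
    """Stub model: label each utterance by its text length. A real
    pipeline would call the deployed model here instead."""
    return ["long" if len(r["text"]) > 10 else "short" for r in batch]

def run_pipeline(records: Iterable[dict], batch_size: int = 2) -> list[dict]:
    """Attach a model prediction to each record, batch by batch."""
    out = []
    for batch in batches(records, batch_size):
        for record, label in zip(batch, predict(batch)):
            out.append({**record, "label": label})
    return out

if __name__ == "__main__":
    data = [{"text": "hi"}, {"text": "a much longer utterance"}, {"text": "ok"}]
    print([r["label"] for r in run_pipeline(data)])  # ['short', 'long', 'short']
```

Swapping the stub for a real endpoint call leaves the batching and record-assembly logic unchanged, which is what makes this shape reusable across model versions.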
  • Job Description:
    • Power BI report development.
    • Building Analysis Services reporting models.
    • Developing visual reports, KPI scorecards, and dashboards using Power BI Desktop.
    • Connecting data sources, importing data, and transforming data for business intelligence.
    • Analytical thinking for translating data into informative reports and visuals.
    • Capable of implementing row-level security on data, along with an understanding of application security layer models in Power BI.
    • Strong command of DAX queries in Power BI Desktop.
    • Expert in advanced-level calculations on data sets.
    • Responsible for design methodology and project documentation.
    • Able to develop tabular and multidimensional models compatible with data warehouse standards.
    • Very good communication skills; able to discuss requirements effectively with client teams and with internal teams.
    Skills & Requirements:
    • Proven working experience as a data analyst or business data analyst, with good knowledge of Power BI.
    • Technical expertise in data models, database design and development, data mining, and segmentation techniques.
    • Strong knowledge of and experience with reporting platforms (Snowflake, Hadoop) and ETL frameworks.
    • Strong analytical skills, with the ability to collect, organize, analyze, and disseminate significant amounts of information with attention to detail and accuracy.
    • Adept at queries, report writing, and presenting findings.
Ready To Join Us?