About Pensa Systems
Pensa is a group of innovators driven to help CPG (Consumer Packaged Goods) brands and retailers grow by providing them the source of truth for the retail shelf. Our patented computer vision, artificial intelligence, and machine learning technology is the world's first fully automatic capture-to-insight retail shelf intelligence solution that learns as it goes. We provide our syndicated SaaS solution to top global CPG brands and retailers such as Johnson & Johnson, General Mills, Anheuser-Busch InBev, Circle K, and Unilever to address a trillion-dollar industry blind spot by digitizing physical inventory and bringing it online. Although we work hard and are accountable to our customers and each other, we don't get too wrapped up in protocol or bureaucracy. We are a diverse group whose collaboration and open communication help us all do our best work and succeed as a team. If you are looking to grow personally and professionally while making a difference with something new and innovative, Pensa may be the right place for you.
If you would like to learn more about us, please visit pensasystems.com.
Job Description
Our software team is responsible for all of the functionality associated with acquiring, processing, and reporting the state of retail shelves. This spans mobile applications, machine learning, web applications, and data analytics. We run large-scale deployments across multiple cloud vendors, with a system that has high uptime and throughput requirements.
The Data Platform Lead is a key member of the engineering team who sits at the intersection of product development and the overall technology vision. They are responsible for providing leadership from an enterprise-wide perspective in the definition, design, review, selection, communication, and implementation of our data architectural direction to most effectively realize our value, vision and strategies.
As the Data Platform Lead, you will define strategy, drive the Data Platform roadmap, help build it out, and work with a team of engineers to deliver it.
In this role, you will
- Lead critical data platform architecture decisions across the whole Pensa Data Platform
- Lead the vision and design of architectural roadmaps and data technology architectures
- Provide expertise on the overall data engineering best practices, standards, architectural approaches and complex technical resolutions
- Strategize and lead delivery of complex client engagements by designing Pensa Data Delivery solutions:
- Understand client goals and objectives
- Assess their current state and identify gaps
- Develop high-priority use cases
We'd love to hear from you if you have
- 10+ years of experience in a Data Platform Architect or Solution Architect role in Data Engineering/Analytics-related businesses with various infrastructure technologies.
- Experience with large scale systems
- Experience with technologies like Python, Airflow, distributed and streaming technologies (Kafka, Kinesis), ETL tools, CI/CD, Terraform, data modeling for transactional and reporting focuses, and Data Warehousing solutions.
- Experience building data-intensive applications in at least one cloud environment
- Ability to define, identify and resolve problems with varying degrees of complexity.
Passionate about Microsoft technologies? Looking for your next big challenge? You've come to the right place. We are currently on the lookout for a Database Developer to join our growing team and work with us on an interesting large-scale FinTech project. The fun side: the project is fully immersed in a wide variety of Microsoft technologies (.NET, Azure, MSSQL, Data Factory, Dynamics...)
Our projects are quite challenging but don't worry - you will be working with a team of passionate, friendly and inquisitive self-starters, who like to follow new technology trends and constantly improve themselves as well as the projects they're working on.
Your role and responsibilities
As a Database Developer, you would be working on a FinTech project that utilises virtual payments, cards and wallets for real-time funding while tracking and analysing those payments for a behaviour-driven brand-building system. You would be responsible for developing, testing, improving and maintaining new and existing databases to help users retrieve data effectively. To be more precise, you will be:
- Designing, creating and supporting stable, reliable and effective databases that can withstand various attacks and loss of information
- Modifying various databases according to user requests
- Involved in schema design, code reviews and SQL query tuning
- Using T-SQL to develop stored procedures, functions, triggers and views
- Working together with developers to improve and optimize the performance of different applications
- Installing, tuning, maintaining and upgrading different DBMS solutions
- Researching, suggesting and owning the process of implementing new solutions and technologies
- Writing technical documentation and providing occasional support to different teams
- Testing, troubleshooting and solving occasional database issues and malfunctions
- Creating reports per user request
- Providing data management support to different teams
- Working with distributed teams in an Agile oriented environment
About you
You're someone with 2+ years of experience working with databases, and you were responsible for database development and management/administration on your previous projects. You are proficient in everything data-related, from inception to execution, maintenance and upgrade. Tackling both relational (MySQL, SQL Server, PostgreSQL...) and non-relational databases (MongoDB, Redis, Cassandra, HBase) is an average day for you. Someone said queries? Well, that's not an issue for you, since you can write an SQL query to fetch almost anything. The basics of SQL Server administration are also nothing new to you (users, permissions, backups, recovery, monitoring...). And since you're analytical, you know your way around creating different reports (SSRS, SSAS). Sounds like you? Great, head on to the next section.
What next?
If you're ready to be a part of a team that works together to achieve both technical and personal greatness, be sure to hit apply.
We will carefully select all candidates for the next steps. For detailed info on our hiring process and what to expect, be sure to check out our Careers page.
Questions?
Not sure if you're the right person for this? Need more info about the project or us? Don't worry, I'm here for you :) Be sure to drop me a message whichever way you like:
- E-mail: lejla.musovic@klika.ba
- Phone: +387 61 907 780
- Viber: +387 61 907 780
- LinkedIn: https://www.linkedin.com/in/le...
OTA Insight is a commercial platform for the hospitality industry (our clients are Airbnb property owners, individual hotels and major chains like Hilton, Accor, etc.). These are exciting times at OTA Insight: we recently raised another round of funding ($80 million) and we have welcomed two new companies, Transparent and Kriya, to our organisation that will expand our product portfolio! To accomplish our ambitious plans, we are growing our Engineering team.
As a Data Science Team Lead, you will be a leader and mentor to a team of Data Scientists with a full range of seniority. You will be responsible for stakeholder management and for driving the roadmap of Data Science research related to existing product solutions.
In this role you will be partnering with multiple different stakeholders, such as our product & design teams, and our truly inspiring team of technical experts on various subjects (DevOps, data engineering, data science, fullstack, security, …). Additionally, you will be responsible for the happiness and growth of Data Scientists in your team.
This is a high-impact role within OTA Insight which combines people and technology leadership.
Responsibilities
- Drive product innovation (in partnership with Product Management):
  - Deeply understand our product and business strategy and client needs;
  - Actively participate in the product ideation and be a critical voice in the process;
  - Own a backlog of product-related data research initiatives.
- Be the bridge between business stakeholders and your team:
  - Coordinate between different stakeholders;
  - Gather requirements and translate them into clearly defined tasks;
  - Document & verify high-level architectural decisions related to Data Science;
  - Partner with product & engineering leadership to define strategy and vision on technology, product or team-related topics and take responsibility in defining how that impacts your team.
- Be a great leader:
  - Plan and prioritise work within your domain roadmap, absorbing planning complexity for team members;
  - Be a sounding board to the team and share knowledge and expertise;
  - Review Data Science projects and take action where necessary;
  - Support team members in their personal and professional growth;
  - Make your team a better place and feel part of OTA Insight’s mission.
- Grow a great team:
  - Hire, develop and coach an exceptional team of Data Scientists;
  - Ensure scalability of processes and best practices in your team.
Requirements
- 5 years working in a Data Science role:
  - Understanding of the steps required in cleaning and transforming datasets;
  - Strong expertise in statistics;
  - Knowledge of standard ML techniques;
  - Ability to derive and clearly communicate insights from a dataset;
  - Experience with a modern Data Science tech stack (Jupyter Lab, cloud infrastructure, etc.).
- Proven record of management skills:
  - Comfortable delegating work through a team of both juniors and seniors;
  - Ability to coach a team, both juniors and seniors;
  - Actively addresses problems when identified and assumes responsibility for work;
  - Able to lead, prioritise, delegate and inspire in an ambiguous, highly dynamic, and highly demanding technology environment.
- Willing to acquire deep knowledge of how revenue management and distribution work within the hotel industry.
- Pragmatic in problem solving and decision making.
- A clear communicator in decisions and the rationale behind them.
- Preferably, has successfully participated in product ideation processes.
Hiring a Senior Data Engineer as we scale our team at Ready and Broadband Money:
Ready is working to level the playing field on this century’s most important utility: making sure everyone has access to fast and reliable internet. And we’re taking a novel approach that’s picking up tailwinds. We need your expertise in database management, spatial-temporal data ETL processes, and model building as we expand the customer base, volume of data, and complexity of our platform.
This is a golden opportunity to join a fast-moving tech company in an industry poised to receive billions of dollars in federal funding – imminently. This means more customers are continuously seeking more from our mapping products. As a Data Engineer at Ready, there’s an opportunity to play a meaningful role in what we’re building.
A bit about you 🥇
- 5+ years related experience (Data Engineer); experience with spatial-temporal databases is a plus
- Experience in data pipelines, big data architecture, data warehouses, ETL and modern ELT processes and automation, and data governance
- Solid experience in frontend OR backend development (full stack is a plus)
- Experience in data preparation, data analysis and building prediction models with open source tools: R, Python, etc.
- You’re organized, thoughtful, creative, and rigorous
- You’re humble, honest, and scrappy
- Able to work efficiently and independently; proactive communicator in a distributed and often asynchronous environment
- You enjoy building lean, inclusive, and performant teams and mentoring junior engineers
About your role at Ready ⚡️
- Design, develop, and implement data pipelines to process incoming data in a scalable manner
- Manage and maintain existing databases
- Automate existing processes, such as data backups, security checks, alerts and disaster recovery, to streamline the workflow
- Develop APIs for data administration and for serving data to other parts of the product
- Work with customers to understand their problems; work with your teammates to devise solutions to those problems
- You’ll have a major impact on your team and company here at Ready
- This role could become a leadership position within Ready as we continue growing together
About Ready 🚀
- Creative problem solvers approaching an antiquated system with a revolutionary viewpoint
- Humble but ambitious, knowledgeable but curious, persistent but not obnoxious
- Concise and effective in written and spoken communication
- Comfortable working remotely
About what you get…
- Competitive salary plus meaningful equity upside
- Competitive (and ever expanding) benefits for employees and dependents
- Opportunities to learn and grow – all things startups
- A chance to play a role in defining the roadmap as we pursue a bold vision and a big goal
- Work from anywhere you want, as long as you can get great internet (and your work here at Ready helps make this true in more places)
- To get away, we all convene 2x / year for [optional] retreats
- We’re actively shaping our benefits program: have a say in which benefits matter to you
- The charter to build a product in a market that is set to receive $65 billion in grant funding across the United States
QS is a global leader in higher education services. We are the data analytics, performance insight provider and intelligence partner supporting university excellence across the world. Our student recruitment and enrolment solutions enable universities and business schools to connect with talented individuals who are seeking to further their academic progress and career development. Our portfolio of professional services includes consultancy, student mobility and academic partnerships management, and branding solutions.
We publish highly visible and influential rankings of international universities, including the QS World University Rankings® which reaches a global audience of hundreds of millions of people. QS keeps growing organically, through acquisition and by recruiting top talent across key regions.
< class="h3">Position Summary:The Senior Data Analyst role for QS Enrolment Solutions will analyse large volumes of complex data to produce insights that can improve business performance. They will help identify stakeholder requirements and develop insightful dashboards and reports to support business decisions and provide customer insights.
Working collaboratively with global stakeholders, the Senior Data Analyst will identify and adopt best practices to solve complex problems in a data-driven manner and use strong analytical skills and business acumen to support leadership in influencing critical business decisions.
The scope of this role will be diverse, from ad-hoc pieces of analysis to ensuring timely regular reporting, enhancing existing reporting pipelines, and working on long-term strategic projects to enhance BI & Analytics.
This role will be required to collaborate with internal stakeholders to understand their data requirements and analyse outcomes as necessary.
< class="h3">Responsibilities:- Data analysis, using statistical techniques, identify, analyse, interpret trends, patterns in new and existing datasets to present insights.
- Guide and mentor the junior data analysts in the team
- Be responsible for defining standardised metrics, methodologies, and models to ensure consistency and accuracy across all analytical and reporting activities
- Make recommendations about the methods and processes in which QSES obtains and analyses data to improve accuracy and the efficiency of data processes
- Collaborate with internal stakeholders to identify, prioritise and implement new data sources and analytical needs of the business.
- Compile reports and provide insights to internal stakeholders to measure and improve business performance.
- Specification and documentation of new reporting requirements.
- Benchmarking and trend analysis against performance targets.
- Write and execute SQL queries to extract and manipulate data for the purpose of analysis and reporting.
- Testing and documentation of data models.
- Check and verify data and analyses to ensure accurate reporting
- Identify and resolve the root cause of data quality issues and develop solutions to improve data quality.
- Support cross-functional teams on day-to-day execution and support ad-hoc requests
- Perform data operations activities such as matching, formatting and mapping data to ensure a global common data set.
- Promote 'Self-serve' analytics by mentoring internal stakeholders in the interpretation and use of analytical reports and models.
- Assist the Director of Business Intelligence with any other duties as required
Requirements:
- BS in Mathematics, Economics, Computer Science, Information Management or Statistics, or a closely related field, or equivalent demonstrable experience.
- 5+ years' experience working in a Data Analyst or Data Science role.
- Strong analytical skills with the ability to collect, organise, analyse, and disseminate significant amounts of information with attention to detail and accuracy.
- 3+ years' experience in formatting, analysing and presenting data in Tableau, Power BI or other BI reporting applications.
- Experience in mentoring a team of analysts.
- Ability to write complex queries in SQL and model data from data processes.
- Experience in maintaining or conceptualising analytical models using R or Python.
- Technical expertise regarding data models, data mining and segmentation techniques
- Experience of working within an RDBMS and Data Warehouse environment.
- Highly developed critical thinking and problem-solving skills with an ability to conceptualise and think creatively.
- Experience of working in small, high-impact teams required to meet tight deadlines.
- Excellent communication skills, both written and verbal, and the ability to elicit reporting and analytical requirements and to present difficult concepts in understandable terms to all levels of the organisation.
- Able to work autonomously and within a team including managing own workload
- Proactive attitude with the ability to use initiative to identify potential issues and offer creative solutions.
- Knowledge of Agile and Waterfall methodologies
- Excellent time management and organisational skills with an aptitude for working under pressure
- Willingness to offer and discuss opinions whilst considering those of others in the team and the wider business
Benefits:
- Competitive package.
- Flexible working.
- Vibrant social environment and multicultural, multinational culture, strong team spirit.
- Focus on welfare – ride to work scheme, global wellness team, Calm app, EAP and health plan, mental health first aiders, diversity and inclusion initiatives.
- Strong recognition and reward programs – peer recognition platform, quarterly and annual awards, annual bonus scheme.
- Support for volunteering and study leave.
QS Quacquarelli Symonds is proud to be a fair and equal organization where everyone has the same opportunity to achieve their full potential, irrespective of their background or personal attributes. We celebrate our diversity and believe that through sharing our experiences we can learn from one another, be stronger together, and enable our business to thrive.
Mind Computing is seeking a Senior Development Data Security Operations (DDSO) SME to join our growing team in support of the Department of Veterans Affairs, specifically the Financial Services Center (FSC). The FSC serves customers internally at the VA and at other government agencies (OGA).
The candidate must reside within the continental US.
Responsibilities:
- Serve as an advisor to leadership concerning the development, design, maintenance, and implementation of enterprise-level Cloud systems.
- Produce architecture diagrams and blueprints, and bring solutions to reality with a DevSecOps mindset & culture.
- Provide technical guidance and foster a collective understanding of data flows and security issues encountered in cloud applications and services.
- Work in Agile methodology and partner with scrum teams for secure application and/or infrastructure solution architecture.
- Deliver Cloud Security Architecture/DevOps on assigned projects using any Cloud Service Provider (CSP).
- Manage the operation and advancement of the CI/CD pipeline.
- Support the government PMO management to lead the DDSO Center of Excellence and meet Center of Excellence requirements.
Requirements:
- Bachelor's Degree in Engineering or Technical related discipline.
- 8 years of experience working functionally across financial services organizations
- Experience and understanding of Infrastructure as Code, Automation, and Orchestration.
- Experience deploying web and service-based applications in Windows/Linux environments.
- 2 or more years of experience doing application and/or infrastructure solution architecture for Azure or AWS equivalent products and services.
- Knowledge and experience across IT infrastructure with security frameworks and standards such as ISO 27001, NIST, and other relevant security-related regulations.
- ISC or CISSP certification.
Additional Qualifications
- Experience in the VA
- Ability to obtain a government clearance.
Benefits:
- Medical/Dental/Vision
- Corporate Laptop
- PTO + Federal Holidays + Sick Leave
- Training opportunities
- Remote work
Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability, or protected veteran status.
Sorry, we are unable to offer sponsorship at this time.
We are looking for a talented Senior Data Scientist to join our Data team and help us change the world of digital advertising together.
WHO WE ARE
At Seedtag our goal is to lead the change in the advertising industry, because we believe that effective advertising should not be at odds with users' privacy.
By combining Natural Language Processing and Computer Vision our proprietary, Machine Learning-based technology provides a human-like understanding of the content of the web that finds the best context for each ad while providing unparalleled risk-mitigation capabilities that protect advertisers from showing their ads on pages that could be damaging for their brand. All of this, without relying on cookies or any other tracking mechanisms.
Every day, our teams develop new services that reach over 200 million users worldwide with fast response times to ensure that we deliver the best user experience. We're fully committed to the DevOps culture, where we provide the platform that our Software Developers and Data Scientists use to manage over 100 different microservices, pushing dozens of changes to production every day. All of this is built on top of Kubernetes in Google Cloud Platform and Amazon Web Services.
If you are interested in joining one of the fastest-growing startups in Europe and working on massive scalability challenges, this is the place for you.
KEY FIGURES
2014 · Founded by two ex-Googlers
2018 · 16M total turnover & internationalization & growth
2021 · Fundraising round of 40M€ & +10 countries & +230 Seedtaggers
2022 · Fundraising round of 250M€ & expansion into the U.S. market & +400 Seedtaggers
YOUR CHALLENGE
- You will work identifying, exploring, and making sense of data sources.
- You will partner closely with the business teams to assess how data can provide value in their units.
- You will develop machine learning solutions to solve business problems: anomaly detection, price optimization, etc.
- You will develop and deploy production-oriented data software.
YOU WILL SUCCEED IN THIS ROLE IF
- You have 2-4 years of solid experience in data science and machine learning
- You (preferentially) have a degree in computer science, engineering, statistics, mathematics, physics or another degree with a strong quantitative component
- You understand the theoretical functioning of the main algorithms in ML and you have experience applying these to real problems
- You have ample experience with one or more of these machine learning tools: Scikit-learn, Tensorflow, PyTorch, etc.
- You are a proactive person who likes the startup work culture
OUR DNA
We are an AdTech family where innovative ideas and new ways of doing things are welcome; we reject "that's the way it's always been done". At Seedtag you will find an energetic, fresh, multicultural work environment, with members from many different countries across Europe, LATAM and beyond, where you will have the chance to directly impact the company's results.
Seedtag's DNA has been unique from the very beginning: we celebrate and embrace diversity, and we want all our members (they, he or she) to feel at home; all human differences are welcome.
SEEDTAGGER'S EXPERIENCE
"Do you want to be involved in the whole process of developing data technology? Then, Seedtag is your place. We are a growing team of techies who are pushing the data to every corner of the company. We are in charge of every aspect of the data pipeline, from the first POCs to the production deployments. If you are a curious inidual who loves playing with data and technology, then don't waste time, apply!" ( Sergio Rozada, Data Scientist at Seedtag)
SEEDTAG BENEFITS
- Key moment to join Seedtag in terms of growth and opportunities
- Career ladder plan for your professional growth
- High-performance tier salary bands with excellent compensation
- One Seedtag: Work for a month from any of our open offices, with travel and stay paid, if you're a top performer (think of Brazil, Mexico...)
- Paid travels to our HQ in Madrid to work p2p with your squad members
- Macbook Pro M1
- ⌛ Flexible schedule to balance work and personal life
- ⛰ An unlimited remote working environment where you can choose to work from home indefinitely or attend our Madrid headquarters whenever you want; there you will find a great workplace with food, snacks, great coffee, and much more.
- Build your home office with a budget of up to 1K€ (external screen, chair, table...)
- A harassment-free, supportive and safe environment to ensure the healthiest and friendliest professional experience, fostering diversity at all levels.
- Optional company-paid English and/or Spanish courses.
- Access to learning opportunities (learning & development budget)
- We love what we do, but we also love having fun. We have many team activities you can join and enjoy with your colleagues! A Yearly offsite with all the company, team offsites, and Christmas events...
- Access to a flexible benefits plan with restaurant, transportation, and kindergarten tickets and discounts on medical insurance
Want to be a Seedtagger? Then send us your CV, we are waiting for it!
mod.io is a fast-growing, early-stage startup backed by leading gaming venture capital firms, working to bring user-generated content (UGC) to games and their players by offering a ready-to-go digital logistics solution that includes community tools, a web UI, and a REST API for any developer to integrate into their game.
The mod.io platform supports over 550,000 daily active users, and we are seeking a Data Engineer to help us scale the industry's best solution for exploring and installing UGC that will be embedded in some of the largest games in the world.
The mod.io service is centered around the REST API which allows developers to bring the modding and UGC experience into a game's UI rather than requiring players to run an external tool or integrate mods manually.
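To make that concrete, here is a minimal, hypothetical sketch in Python of a game client or tool fetching a mod listing over REST. The endpoint path, query parameters, and response shape are illustrative assumptions, not taken from mod.io's actual API reference.

```python
# Illustrative only: a REST call of the general shape described above.
# Base URL, query parameters, and response fields are assumptions.
import requests

API_BASE = "https://api.mod.io/v1"   # assumed base URL
API_KEY = "your-api-key"             # placeholder credential
GAME_ID = 1234                       # hypothetical game id


def list_mods(game_id: int) -> list:
    """Fetch one page of UGC entries for a game so a client UI can render them."""
    resp = requests.get(
        f"{API_BASE}/games/{game_id}/mods",
        params={"api_key": API_KEY, "_limit": 10},  # assumed pagination param
    )
    resp.raise_for_status()
    return resp.json().get("data", [])


for mod in list_mods(GAME_ID):
    print(mod.get("name"))
```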
Why mod.io
The popularity of in-game content is exploding; it's in more games and on more platforms than ever before. We have experienced up to 20x growth in our key metrics, have a database with over a billion rows, and plan to grow even larger.
So if you enjoy solving scaling challenges for a company expecting significant growth, and you want to work at a company and in an industry where data and data insights matter, then mod.io is the place for you.
Requirements
We are looking for a savvy Data Engineer to join our growing team. You will be responsible for expanding and optimizing our data and data pipeline architecture, as well as optimizing data flow and collection across 1,000s of data inputs.
The ideal candidate is an experienced data pipeline builder and data wrangler who enjoys optimizing data systems and building them from the ground up. The Data Engineer is also a hands-on software developer, able to write and modify the code to capture relevant data, massage it into meaningful insights and make these insights available via APIs. You will also ensure we have an optimal data delivery architecture that is consistent throughout all our projects.
You must be self-directed and comfortable supporting the data needs of multiple project teams.
You will be excited by the prospect of designing and optimizing our company’s data architecture to support our next generation of growth, products and data initiatives.
Your Responsibilities:
- Understand mod.io’s data needs and objectives
- Provide architectural and implementation leadership for data architectures, data warehouses, data lakes and other cloud-related data initiatives based on mod.io’s needs and objectives
- Build and maintain an optimal data ingestion & ETL solution
- Assemble large, complex data sets that meet functional and non-functional business requirements
- Explore and implement ways to enhance data quality and reliability
- Identify opportunities for (relevant) data acquisition
- Consider legal and regulatory obligations in regard to data capture and usage
- Identify, design, and implement internal process improvements: automate manual processes, optimize data delivery, (re)design infrastructure for greater scalability, etc.
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS ‘big data’ technologies
- Build or implement BI and query/reporting tools that utilize the data pipeline to provide actionable insights into key business performance metrics for presentation internally and externally (e.g. customer dashboards)
- Work with project teams to assist with data-related technical issues and support their data infrastructure needs.
- Keep our data separated and secure across international boundaries through multiple data centers and AWS regions.
- Provide data and analytics expertise as we strive for greater functionality in our data systems.
- In relation to our data pipeline architecture:
  - Deploy and maintain production environments that require high availability.
  - Monitor data servers to proactively identify performance issues and problematic trends, and troubleshoot/escalate to resolve as appropriate.
  - Drive the product towards higher availability and reliability & assist with on-call support on a rotating schedule for incident escalations (24x7).
  - Ensure our data services meet stability, performance, and availability requirements.
  - Monitor backups, usage, capacity, and performance of servers; liaise with users and/or vendors to address problems and changes in requirements.
  - Build robust, self-healing features and automation that reduce operational effort and improve service up-time.
- Self-starter mindset with a strong drive to learn and own engineering initiatives to promote a culture of continuous improvement, and engineering excellence.
Qualifications
- Extensive experience working with relational databases, query authoring (SQL) and designing database backup and replication approaches
- Familiarity with a variety of database technologies
- Experience building and optimizing ‘big data’ data pipelines, architectures and data sets.
- Advanced SQL knowledge
- Expert ability to tune databases and optimise query performance
- Extensive experience in our core database technologies: MySQL, Redshift (PostgreSQL) and Memcached
- Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
- Strong analytic skills related to working with unstructured datasets.
- Build processes supporting data transformation, data structures, metadata, dependency and workload management.
- A successful history of manipulating, processing and extracting value from large disconnected datasets.
- Working knowledge of message queuing, stream processing, and highly scalable ‘big data’ data stores.
- Strong project management and organizational skills.
- Experience supporting and working with cross-functional teams in a dynamic environment.
- Experience with big data tools: Hadoop, Spark, Kafka, etc.
- Experience with relational SQL and NoSQL databases, including Postgres and MySQL
- Experience with data pipeline and workflow management tools: Azkaban, Luigi, Airflow, etc.
- Experience with AWS cloud services: EC2, EMR, RDS, Redshift
- Experience with stream-processing systems: Storm, Spark-Streaming, etc.
- Experience with object-oriented/object function scripting languages: Python, Java, C++, Scala, etc.
- Experience in one of the following Scripting languages: Python, PowerShell, Bash, Shell Script.
- Experience with monitoring and logging services (e.g. Elasticsearch, Wavefront, Uptime, Solarwinds, or similar).
It would be awesome if you also have:
- Data engineering certification (e.g AWS Certified Data Engineer)
- Experience with cloud-based infrastructure and services (AWS)
- Experience working in an agile environment
- Experience with Jenkins or similar build automation tools.
- Experience with Machine learning and AI over large data sets
- Experience with Trello and G-Suite
- A passion for video games.
Benefits
- Remote working is actively supported.
- Competitive salary plus equity.
- Flexible working hours and family-friendly considerations.
- Sit-stand desks, 27” monitor, ergonomic chairs.
- Regular social events.
- Experience new games, digital and tabletop.
- Attend international gaming conferences.
- Contributing to open-source on GitHub
Title: Healthcare Data Entry Specialist
Location: US National – Remote Full-Time
You will represent clients on a variety of projects via inbound/outbound telecommunication.
This is your opportunity to join Ashfield and represent a top biotechnology company.
What’s in it for you?
- Temporary Project with opportunity to interview with other teams internally
- Competitive compensation
- Generous performance-driven Incentive Compensation package
- Competitive environment with company wide recognition, contests and coveted awards
Key Objectives:
- Maintain excellent quality standards for all client programs; adhere to program guidelines.
- Accurately transcribe and enter the information required by individual programs and correctly capture it in specific program databases.
- Adhere to all company policies and Standard Operating Procedures.
- Display flexibility within the department to maximize utilization.
- Exhibit highly effective transcription and data entry skills, meeting or exceeding productivity expectations.
- Safeguard patient privacy and confidentiality by following the guidelines set forth in the Privacy and Security Rules of the Health Insurance Portability and Accountability Act (HIPAA).
- Manage day-to-day activities of patient and health care provider support requests and deliverables across multiple communication channels, i.e. fax, chat, email, etc.
- Perform intake of cases and capture all relevant information in the Case Management system.
- Ensure all requested support is captured within the Case Management system and routed to the appropriate next step using decision tools and reference guides.
- Ensure timely and accurate processing of requests, including reviewing source documentation.
- Escalate complex cases when appropriate.
- Maintain excellent quality standards for all client programs; adhere to program requirements and guidelines.
- Accurately transcribe and document information received via form into client databases.
Job Holder Specification:
- High School Diploma required; Bachelor’s degree or equivalent work-related experience preferred.
- Excellent verbal, written and listening communication skills.
- Knowledge of reimbursement mechanisms for medical and pharmacy benefits, patient access processes and patient assistance programs; operational policies and processes preferred.
- Proficiency in reviewing intake documents thoroughly and entering information in a database with little to no errors.
- Proficiency with Word and Excel.
- Analytical thinking, problem solving and decision making.
- Ability to multitask and manage multiple parallel projects with strong time management skills.
Ofertia is a digital company based in lively Barcelona. Our mission is to revolutionize the way people shop, bridging the gap between the online and offline shopping worlds. We help retailers drive more customers to their stores by publishing digital circulars on our mobile apps and web portals, where consumers can find the best offers from their favorite stores around them.
Ofertia is now part of the Mediapost Group with strong presence in Spain. Moreover, we operate in Mexico, Colombia and Sweden.
We are looking for an experienced Data Scientist (m/f) who is, like us, passionate about delivering the best product and user experience to our customers. As a key member of our cross-functional and agile team setup, you will leverage your skills and ability to extract valuable insights from various data sets. You will work with a variety of challenges that touch all parts of the business. Our goal is to design a better product that customers will love, enabling us to get meaningful data for us to make the most accurate decisions.
What would you do at Ofertia?
Identify the best state-of-the-art algorithms and libraries to solve complex problems.
Perform ad-hoc data analysis to answer critical business questions and to identify potential growth and development opportunities.
Enrich the company’s data with third-party information when needed.
Improve data collection procedures to include information that is relevant for building better analytic systems.
Collaborate with and contribute to functional and cross-functional groups and initiatives on methodologies, innovations, technology, IT infrastructure, etc., to enable broader and more effective use of data.
Improve our internal data science tools and frameworks.
Requirements
Strong programming skills in Python
3+ years of prior experience in a Data Scientist role, including statistical modelling, simulation and analysis, and machine learning algorithms for prediction, recommendation and object recognition
Good knowledge and experience in image recognition, text extraction algorithms
Good knowledge of libraries like Pandas, PuLP, Matplotlib, seaborn, scikit-learn
Experience working with Linear Programming and Mixed-Integer Linear Programming
Experience with TensorFlow 2
Good knowledge of SQL
Comfortable with Data Engineering skills
Good knowledge of AWS Services like EC2, S3, Data Pipeline
Basic knowledge of third-party API integration to read data from external sources
Experience in executing data science projects
Personal skills
Excellent communication and collaboration skills; able to explain complex effects and impacts of ML/DL insights in simple business terms. Entrepreneurial mindset, self-starter, and able to operate independently
What makes working at Ofertia great
The opportunity to bring our IT platform to the next level.
Ofertia is one of the most exciting digital companies in the heart of Barcelona with a disruptive product.
We support your personal and professional development with challenging projects and training programs.
Flexible working hours and possibility to work 100% remotely (within Spain).
A company culture driven by pioneer-thinking and talent that crosses departments through flat hierarchies and short communication channels.
Who we are, what we do & why we do it
We are Dext. Our suite of tools makes accountants more productive, profitable and powerful. In doing so we give them back the most precious commodity, time, which they can then use to add greater value to their clients.
Accountants and bookkeepers are the backbone of every successful business. For more than a decade we’ve empowered our partners with innovative technology solutions to make businesses better. Dext allows them to meet the challenges they face today, tomorrow and in the future.
We are now seeking an experienced Product Manager to help with our continuous efforts to improve the level of automation we bring to our extraction processes.
The role (what you’ll do):
- Manage the data extraction lifecycle; you will be responsible for determining what 'success' looks like, taking into consideration a number of macro and micro factors
- Research new ways to improve our automation processes, working closely with our machine learning lead and lead data scientist on requirements, data capture, sampling... ultimately using the tools at your disposal to spot actionable insights
- Channel the voice of the customer to ensure data outputs are in line with user goals
- Evaluate and implement new product ideas
- Work closely with the wider product team and stakeholders to align on product releases and team roadmaps
- Ensure that stakeholder needs are considered and evaluated during the development stages
About you (what we are looking for):
- Equally comfortable using reporting tools such as Looker, Snowflake, SQL as well as building simple data science models in a notebook.
- Successful candidates in the past have come from a data science background or product analysts who have applied data science techniques.
- Ideally, previous Product Management experience in a fast-paced technology business. Some experience of advising on/making product-led decisions is needed.
- Ideally, previous experience in a SaaS, rapid growth environment
- Capable of making data driven decisions
- Stakeholder management
- Managing and prioritising a backlog
These are our ideal requirements, but we hire on potential, not just on experience, and we know that some people are less likely to apply for a role if they don’t meet 100% of the criteria. At Dext we are committed to cultivating a diverse, inclusive and empowering culture, so please apply if you meet the majority of these competencies.
You can read more about our Diversity & Inclusion commitments here.
What you will be a part of:
We are a highly ambitious, innovative, market-leading FinTech. We are a global, well-funded business but have the dexterity and pace of a scale-up. We are uncompromising in our desire to achieve our best, day-in day-out, and we have three clear values which guide everything we do:
Be Brave - Everyone in the company has a voice to challenge ideas and the status quo.
Be Exceptional - We set high standards for ourselves. We aim to be exceptional at what we do.
Be Together - We are one team. There is no such thing as inidual success without team success.
DirectID is growing fast and thoughtfully. We are a remote-first organisation and have been recognised as one of Scotland’s top 10 most flexible employers. We know we're not just building world-class products but also a world-class team and a set of sustainable development practices that will continue to deliver value as we scale.
We believe that the best work comes from cross-functional, self-organising, and diverse teams that take ownership of their processes and practices. You will be trusted to make things happen.
Our people set us apart and these are some of the words they use to describe our culture: Flexible, Fast, Compassionate, Potential, Professional, Fun.
Our values have grown organically through our behaviours and outcomes, driving everything we do: We’re a Clan, With one Shared Vision, For our Customers and Brave at Heart.
YOUR ROLE
As a data analyst, you will work within our Data Insights team, close to our data scientists and product team. Your work will help us improve our analytical solutions and will have a direct positive impact on our customers' experience.
We are looking for a Data Analyst who will be instrumental in leading improvements to our machine learning models driven by annotation. You'll help build and improve our Machine Learning models and the processes related to their performance. Dealing with huge datasets, you will have ownership over the annotation and visualization tasks assigned to you, which will help shape our existing and future products.
Requirements
YOU NEED TO HAVE
Some prior experience in linguistic annotation (text data annotation)
Demonstrable Power BI experience and data visualization skills
A passion for data, efficiency, and accuracy
A curious nature, not being biased by previous knowledge
Experience delivering in data science/engineering environments.
Demonstrable experience and understanding of Power BI
Some experience using a variety of data mining/data analysis methods.
Some experience with C#, Azure ML, SQL
Proactive management of inidual tasks alongside an open collaboration with the wider team.
Benefits
THE PERKS (benefits may vary depending on location)
- A team of passionate, interesting people committed to your success.
- Challenging problems to solve.
- We're a growing company; your contributions will be valued.
- £35k gross/pension/EMI share scheme.
- Continuous Professional Development budget (CPD).
- Uncapped bike to work scheme.
- Monthly recharge time.
- Clan events and workshops.
- Generous holiday allowance (we will insist you take it!).
- Home working contribution to set up (get comfy, we want you to stay)
- Ask us about Flexible Working
REDEFINE AN INDUSTRY
We are on a mission to enable our customers across the globe to effortlessly make use of bank data to better understand their customers, grow their business, revolutionise their offerings and delight with customer service.
At DirectID you will be working for a business that is redefining the industry for lenders and their consumers through our global credit & risk platform. This is an exciting stage in our growth and we’d love you to be part of the story.
MEMX is looking for a passionate, self-motivated and hard-working Data Engineer to join a fast-paced and highly technical team. You will collaborate with many different functions that are end users of the data platform. You will contribute to innovative analytics and data utilization projects that will continue to differentiate MEMX from the competition and support regulatory compliance.
< class="h3">Responsibilities:
- Developing and maintaining queries.
- Developing and maintaining operational dashboards.
- General Administration of Data Platform software and servers.
- Generation and submission of reports.
- Resolving data platform issues.
- Scripting and building automation tools
- Partner with the Operations, Dev, Member Experience and other teams to understand data requirements
- Design, build and launch extremely efficient and reliable data pipelines to move data across a number of platforms to meet daily and monthly reporting requirements.
Requirements
- 3+ years of industry experience in software development, data engineering, business intelligence, data science, or related field with a track record of manipulating, processing, and extracting value from large datasets.
- Programming experience in building high quality software.
- Java experience preferred
- Linux Scripting Experience
- SQL Experience
- Experience with designing and implementing real-time pipelines
- Experience with data quality and validation
- Experience designing data schemas
- 1+ year(s) of experience with a public cloud (AWS, Microsoft Azure, Google Cloud)
- Demonstrated ability to work well independently and within a fast-paced, collaborative environment
- Demonstrated communication and interpersonal skills to work across diverse stakeholders and cross-functional teams
Benefits
Members Exchange (MEMX) is a growing FinTech firm founded by a group of leaders in the world’s financial markets and is currently the fastest growing U.S. equities exchange. Our people are the foundation of our business, and we are committed to maintaining the culture we have set in motion. We take great pride in our selection process — and that starts with finding the right people. At MEMX you will have the ability to work with a talented team of professionals who bring diversity of thought and background. You will have the opportunity to shape the future of our company and the impact MEMX will have on our clients and the broader markets. We offer competitive employee benefits and perks and will continue to make this a priority to attract the best.
· Fully Remote Workforce*
· Health Care Plan (Medical, Dental & Vision)
· Company Sponsored Retirement Plan
· Unlimited PTO
· Paid Family Leave
· Short-Term & Long-Term Disability
· Training & Development
· Wellness Resources
*Current list of approved remote work states:
- Connecticut
- Florida
- Illinois
- Kansas
- Maine
- New Jersey
- New York
- North Carolina
- Pennsylvania
- South Carolina
This is a remote position.
< class="h4" id="Overview" style="margin: 0px; padding: 0px; font-style: normal; line-height: 1.428; color: #172b4d; font-weight: 600; letter-spacing: -0.003em; text-transform: none; orphans: 2; text-indent: 0px; white-space: pre-wrap; widows: 2; word-spacing: 0px; background-color: #ffffff;">OverviewAnant is a destination employer for high-performing, erse, global talent. Our Data Engineers support the development, operation, and maintenance of real-time data processing. They oversee and deliver the success of client and internal projects. The Data Engineer will not only support our internal team, but will also participate in client project work including design of novel systems, debugging performance degradations and read/write latencies, audits, monitoring, and health checks. An ideal Data Engineering candidate will have experience supporting rollout of migration tooling through client environments by troubleshooting GKE, Airflow, Dataproc, DataStax Enterprise, and DataStax Astra. Other candidates will gain experience using these tools.
We are looking for a Data Engineer to join our team immediately. We look for the best and brightest and those willing to learn.
Soft Skills
- Demonstrate a passion for excellence in work product and customer delivery
- Create and deliver live and recorded demos for customers and internal stakeholders
- Familiarity with the enterprise data platform ecosystem
- Continuous learning mindset
Hard Skills
- Troubleshoot and support rollout of tooling and services that use Airflow (on K8s), Spark (managed), DataStax Enterprise, and DataStax Astra
- Create, troubleshoot, and refactor Python DAGs (see the sketch after this list)
- Create and deploy infrastructure as code via Ansible and Terraform
- Demonstrate familiarity with creating and destroying resources on GCP, including GCP monitoring dashboards
- Demonstrate an aptitude for RCA and troubleshooting code and systems integration issues
- Familiarity with Scala, Python, and Java
- Conduct rapid POC development and be able to transfer knowledge to others
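For illustration, here is a minimal sketch of the kind of Python DAG this work involves. The task names, schedule, and logic are hypothetical examples (assuming Airflow 2.x), not an actual Anant or client pipeline.

```python
# Minimal, illustrative Airflow 2.x DAG: a two-step extract -> load pipeline.
# All identifiers here are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    """Pull a small batch of records from a source system (stubbed)."""
    return [{"id": 1}, {"id": 2}]


def load(ti):
    """Pull the upstream result from XCom and 'load' it (stubbed)."""
    records = ti.xcom_pull(task_ids="extract")
    print(f"Loading {len(records)} records")


with DAG(
    dag_id="example_etl",            # hypothetical DAG name
    start_date=datetime(2022, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task        # run extract before load
```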
Most Wanted Qualifications
- Certifications in Spark, Cassandra, Terraform, and/or Cloud Platform Services like AWS, GCP, or Azure
- 2+ years of relevant software design and development, including the below, as well as source control apps such as Bitbucket, GitHub, etc.
- 3+ years of relevant experience in Ansible, Docker, Prometheus, Grafana, Helm
- 3+ years engineering in Kubernetes-based environments as well as variants thereof (e.g., GKE)
- 5+ years of relevant software design and development in Terraform, Spark, Dataproc, Cassandra (including DSE, Astra, and other variants), and Google Cloud Platform
< class="h4" id="Responsibilities" style="margin: 1.357em 0px 0px; padding: 0px; font-style: normal; line-height: 1.428; color: #172b4d; font-weight: 600; letter-spacing: -0.003em; text-transform: none; orphans: 2; text-indent: 0px; white-space: pre-wrap; widows: 2; word-spacing: 0px; background-color: #ffffff;">Responsibilities
-
Work with multiple teams and multiple projects (e.g., application, infrastructure, cloud, etc.) to
-
Complete requests (adding new or decommissioning existing clusters)
-
Debug and resolve issues
-
-
Utilize project management software (e.g., Jira) to log time and resolve tickets
-
Create and update SOP’s, Runbooks, issue reports, and other documentation as required
-
Consult on client projects, maintain client confidentiality and protect client operations by keeping information confidential
-
Contribute to team effort by using effective communication skills, being a self-starter, and taking responsibility for deliverables
< class="h4" id="Qualifications" style="margin: 1.357em 0px 0px; padding: 0px; font-style: normal; line-height: 1.428; color: #172b4d; font-weight: 600; letter-spacing: -0.003em; text-transform: none; orphans: 2; text-indent: 0px; white-space: pre-wrap; widows: 2; word-spacing: 0px; background-color: #ffffff;">Qualifications
-
BS degree in Computer Science or related technical field involving coding, or equivalent practical experience
-
Ability to troubleshoot, debug and optimize code
-
Ability to identify and automate routine tasks
-
Incident management and root cause analysis
-
Strong organizational, time management, and detail skills.
-
Strong communication and interpersonal skills, able to comfortably and pleasantly deal with a variety of people
Working at Anant
- Anant performs business around the clock, but some availability during US Eastern Time business hours is important.
- Anant is a 100% remote workplace.
- Anant is currently looking to hire part time, with future full-time work available.
< class="h3">Benefits < class="h2" id="About-Anant" style="margin: 1.8em 0px 0px; padding: 0px; font-style: normal; line-height: 1.2; color: #172b4d; font-weight: 500; letter-spacing: -0.008em; text-transform: none; border-bottom-color: #cccccc; orphans: 2; text-indent: 0px; white-space: pre-wrap; widows: 2; word-spacing: 0px; background-color: #ffffff;">About Anant
Anant is working to become the authoritative market leader in business platforms. Most technology leaders have a hard time retaining the experts to help them build and manage global data platforms because of the high costs of specialized talent. We created a training program for client teams and a network of trained specialists on our framework who are available on a full-time, part-time, or project-by-project basis.
Anant is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to sex, gender identity, sexual orientation, race, color, religion, national origin, disability, protected veteran status, age, or any other characteristic protected by law.
< class="h1">Skills
Certifications in Spark, Cassandra, Terraform, and/or Cloud Platform Services like AWS, GCP, or Azure 2+ years of relevant software design and development including the below as well as source control apps such as Bitbucket, Github, etc. 3+ years of relevant experience in Ansible, Docker, Prometheus, Grafana, Helm 3+ years engineering in Kubernetes-based environments as well as variants thereof (e.g., GKE) 5+ years of relevant software design and development in Terraform, Spark, Dataproc, Cassandra (including DSE, Astra, and other variants), and Google Cloud Platform
< class="h1">Experience4-5 years
NBCUniversal owns and operates over 20 different businesses across 30 countries including a valuable portfolio of news and entertainment television networks, a premier motion picture company, significant television production operations, a leading television stations group, world-renowned theme parks and a premium ad-supported streaming service.
Here you can be your authentic self. As a company uniquely positioned to educate, entertain and empower through our platforms, Comcast NBCUniversal stands for including everyone. We strive to foster a diverse and inclusive culture where our employees feel supported, embraced and heard. We believe that our workforce should represent the communities we live in, so that together, we can continue to create and deliver content that reflects the current and ever-changing face of the world.
Job Description
Welcome to Peacock, the dynamic new streaming service from NBCUniversal. Here you’ll find more than a job. You’ll find a fast-paced, high-flying team for unique birds that want to be at the epicenter of technology, sports, news, tv, movies and more. Our flock works hard to connect people to what they love, each other and the world around them by creating shared experiences through culture-defining entertainment.
As a company, we embrace the power of difference. Our team is committed to creating an organization that champions diversity and inclusivity for all by curating content and a workforce that represents the world around us. We continue to challenge ourselves and the industry by being customer-centric, data-driven creatures of innovation. At Peacock, we are determined to forge the next frontier of streaming through creativity, teamwork, and talent. Here you can fly to new heights!
As a member of the Peacock Video Quality & CDN Team, you will perform complex data analysis, design and create queries across multiple data sets, and make them available for consumption through dashboards and/or ad hoc reports.
Working alongside Data Engineers and CDN Architects, you will deliver key insights and analytics that will help make better business decisions, improve KPIs, and deep dive on issues particular to video and ad delivery systems.
Responsibilities
- Provide daily trend analysis on Core video KPIs
- Serve as the regional Video Metrics SME, interfacing with other technology domains and providing expertise to support, development and operations teams
- Work with front end RUM and backend KPI data
- Work with ML-based alerting and anomaly detection systems
- Play an integral role in the contribution towards the international supplier choice and commercial management strategy
- Bachelor’s degree in Computer Science, Information Technology or a relevant field
- Minimum five (5) years of experience working with video or similar data sets
- Solid understanding of telecommunication & Internet technologies
- Solid understanding of the IP video chain (e.g. encoding, packaging, origin layer, network integration)
- Experience in video performance management
- Ability to oversee communications concerning CDNs including presentations to executives and building detailed design documentation
- Experience with creating data visualizations to communicate metric performance.
- Experience with statistical analysis or data modeling techniques.
- Experience building data infrastructure and implementing reporting solutions.
- Experience with Google and Amazon Cloud tools is a plus
- Ability to manage multiple projects simultaneously and deal with ambiguity
- Excellent verbal and written communication skills
- Background in utilizing SQL/Python to extract, manipulate, and analyze datasets (see the sketch below).
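To make the SQL/Python expectation above concrete, here is a minimal sketch of a daily trend analysis for one core video KPI. The warehouse DSN, the video_sessions table, and its columns are illustrative assumptions, not a real Peacock schema.

```python
# Minimal sketch: daily trend for a core video KPI (rebuffering ratio).
# The DSN, table, and columns below are illustrative assumptions.
import pandas as pd
import sqlalchemy

engine = sqlalchemy.create_engine("postgresql://user:pass@warehouse/metrics")  # placeholder

query = """
SELECT date_trunc('day', event_time) AS day,
       SUM(rebuffer_seconds) / NULLIF(SUM(watch_seconds), 0) AS rebuffer_ratio
FROM video_sessions
GROUP BY 1
ORDER BY 1;
"""

kpi = pd.read_sql(query, engine)
# A 7-day rolling mean smooths daily noise so regressions stand out.
kpi["rolling_7d"] = kpi["rebuffer_ratio"].rolling(7).mean()
print(kpi.tail())
```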
Desired Characteristics:
- Strong analytical skills with a ‘join-the-dots’ approach
- Knowledge of data analytics, modelling and statistical analysis
- Experience working as part of a geographically distributed team
- Proactive, independent, and able to articulate ideas to technical and non-technical audiences.
- Should be a problem solver with an open mind and an eagerness to pick up new skills
- Telecommunications industry experience (or similar, e.g. academia). BSc/MSc/PhD in Computer Science, Electrical Engineering, or similar
NBCUniversal's policy is to provide equal employment opportunities to all applicants and employees without regard to race, color, religion, creed, gender, gender identity or expression, age, national origin or ancestry, citizenship, disability, sexual orientation, marital status, pregnancy, veteran status, membership in the uniformed services, genetic information, or any other basis protected by applicable law. NBCUniversal will consider for employment qualified applicants with criminal histories in a manner consistent with relevant legal requirements, including the City of Los Angeles Fair Chance Initiative For Hiring Ordinance, where applicable.
If you are a qualified individual with a disability or a disabled veteran, you have the right to request a reasonable accommodation if you are unable or limited in your ability to use or access nbcunicareers.com as a result of your disability. You can request reasonable accommodations in the US by calling 1-818-777-4107 and in the UK by calling +44 2036185726.
- Extract, combine, transform and store data from blockchains, exchanges and other sources utilizing modern data processing frameworks (a simplified sketch follows this list)
- Maintain, improve and extend our data pipelines
- Independently identify new ways to arrange and enrich data to enable novel research and product directions
- Work closely together with our data scientists to get the most out of our on-chain and crypto-financial data
- Introduce the necessary tools to monitor and QA our data pipeline infrastructure
- Collaborate with our team of backend engineers and accompany code from ideation to implementation
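As a rough illustration of the extract-transform-store work described in the first bullet, here is a minimal Python ETL sketch. The exchange endpoint, response fields, and Postgres table are hypothetical placeholders, not a real API.

```python
# Hypothetical ETL sketch: extract trades from an exchange API, enrich them,
# and store them in Postgres. Endpoint, fields, and table are placeholders.
import pandas as pd
import requests
import sqlalchemy

RAW_URL = "https://api.example-exchange.com/v1/trades"  # placeholder endpoint

def extract() -> pd.DataFrame:
    resp = requests.get(RAW_URL, params={"symbol": "BTC-USD", "limit": 1000}, timeout=30)
    resp.raise_for_status()
    return pd.DataFrame(resp.json())

def transform(trades: pd.DataFrame) -> pd.DataFrame:
    # Enrich raw trades with a notional-value column for downstream research.
    trades["ts"] = pd.to_datetime(trades["timestamp"], unit="ms")
    trades["notional"] = trades["price"] * trades["size"]
    return trades[["ts", "price", "size", "notional"]]

def load(trades: pd.DataFrame) -> None:
    engine = sqlalchemy.create_engine("postgresql://user:pass@localhost/crypto")  # placeholder
    trades.to_sql("trades", engine, if_exists="append", index=False)

if __name__ == "__main__":
    load(transform(extract()))
```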
- 4+ years working as a data engineer or a relevant field
- Extensive experience working with high-performance ETL data pipelines using state-of-the-art methodologies and tooling
- Advanced knowledge of large-scale data analysis with a great understanding of scalability and robustness
- High proficiency in Python and its open-source data science ecosystem (pandas, numpy, matplotlib, ...)
- Very good knowledge of SQL (Postgres) and BigQuery
- Solid understanding and experience of microservices (e.g. using Docker and Kubernetes) as well as cloud infrastructure (preferably GCP)
- Passionate about Bitcoin and the crypto industry
- Working experience with financial (or other time-series) data
- Experience with Graph analysis and graph databases
- Experience with Argo workflows
- Experience with Golang and/or (Py)Spark
- A modern technical stack with an emphasis on quality
- Flexibility to organise your work and hours the way you like, our remote-first setup enables this for everyone
- Join a young, self-funded and already profitable company in a future-proof market
- Freedom to own your decisions and experiment, we need driven experts who help us figure out what to do, not for us to point at what needs to be done
- Be one of the main contributors to building a company with its unique culture in the cutting-edge tech space
- No approval loops or unnecessary processes, quick decision-making and full ownership of your function
- We are working on a kick-ass, meaningful benefits package, something truly useful and empowering - you will help us figure out what we'd need to stand out as the workplace of the future; we are open to any ideas
- Frequent company offsites; we love remote but love to have fun together too - this year we went to Lisbon for a week, we're also coming to Switzerland for our Xmas party, and more adventures are ahead
This is a remote position.
The mission of the Processing Team is to build cross-domain systems to perform RF-based data collection and geolocation. The Processing team includes experts across FPGA development, embedded software, software defined radio, and cloud development; plus deep knowledge of signal-of-interest (SOI) digital signal processing, RF communications systems, RF measurement systems, and geolocation. Our client is currently seeking a Senior Data Engineer who can help the Processing team design, build, and deploy data pipelines for RF processing and geolocation.
As a senior data engineer on the Processing team, you will be responsible for designing and implementing distributed, reliable backend software systems to consume and leverage RF data at scale. You will need experience building and running data pipelines in production, with a passion for robustness, observability, and monitoring. A successful data engineer will be expected to work closely with RF & geolocation domain specialists, data scientists, and analysts to deploy pipelines while optimizing for both performance and low latency. We support a broad range of software to accomplish our mission, especially favoring Python and C++ for backend software; Kubernetes clusters on AWS; data pipelines orchestrated with Airflow; data storage with Amazon S3 and PostgreSQL as appropriate; and Elasticsearch and Kibana for log analytics and monitoring dashboards.
Location: This position can be hybrid with work-from-home flexibility or 100% remote.
Your main responsibilities will be:
- Contribute to the design, implementation, and testing of the company's data platform and data pipelines, optimizing for scalable, low-latency deployment within a batch-processing cloud environment (see the Airflow sketch after this list)
- Build, document, and support software systems & tools (data pipelines, utility libraries, core features, etc) enabling high-quality research and production deployments throughout the team
- Define scope, build consensus within the technical team, and drive new feature development with input from stakeholders throughout the company
- Participate in collaborative & fast-paced software development practices, particularly performing merge request reviews, providing design feedback, etc
- Guide and mentor other individual contributors; work closely with RF & Geolocation domain specialists to achieve the team mission
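For a sense of how the Airflow-orchestrated pipelines mentioned above might be laid out, here is a skeleton DAG sketch. Task names and bodies are illustrative placeholders under the stack described in the posting (Airflow, S3, PostgreSQL).

```python
# Skeleton Airflow DAG under the stack named above (Airflow + S3 + PostgreSQL).
# Task names and bodies are illustrative placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def ingest_rf_captures(**context):
    # Placeholder: pull raw RF capture metadata from S3 (e.g., via boto3).
    ...

def geolocate(**context):
    # Placeholder: run the batch geolocation job and write results to Postgres.
    ...

with DAG(
    dag_id="rf_geolocation_pipeline",
    start_date=datetime(2022, 1, 1),
    schedule_interval="@hourly",
    catchup=False,
) as dag:
    ingest = PythonOperator(task_id="ingest_rf_captures", python_callable=ingest_rf_captures)
    locate = PythonOperator(task_id="geolocate", python_callable=geolocate)
    ingest >> locate  # geolocation runs only after ingestion succeeds
```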
Requirements
Education and experience:
- B.S. degree in Computer Science or comparable or equivalent experience
- 6+ years of professional experience
- 3+ years of experience building data pipelines and other cloud-based infrastructure: workflow management (e.g. Airflow, Argo workflows, AWS step functions), object storage, relational databases (specifically PostgreSQL, PostGIS, and experience writing/testing SQL), REST/GraphQL APIs, message passing (Kafka, SNS), etc
- Experience with data science and/or software development using Python, especially using industry-standard Python libraries: pandas, scipy, scikit-learn, dask/ray, flask, FastAPI, etc
- Experience building software and tools facilitating effective research & development – a passion for writing clean code, scalable architectures, test-driven development, and robust logging
Essential:
- Familiarity with CI/CD best practices: automated testing, using a dev/prod workflow, deploying to Artifactory or other package manager, deploying containerized software, etc.
- Track record of building and supporting mission-critical backend applications in production
Desirable:
- Experience administrating modern cloud applications and infrastructure running in Kubernetes on AWS or another cloud provider
- Working knowledge of frontend development (react/angular, javascript, web-assembly, etc), especially prior examples building proof-of-concept applications to consume & interact with data products
- Familiarity with the ELK stack (elasticsearch, logstash, kibana) for aggregating logs, creating queries/dashboards, and monitoring production deployments in real time
- Familiarity with software acceleration including multi-core parallelism, cluster-based scaling (e.g. Dask, Spark, etc), and/or GPUs, for bespoke applications
- Familiarity with RF signal processing or geolocation algorithms and applications, particularly in a batch-processed cloud environment
Company Overview: Our client is delivering a revolutionary source of global knowledge based on radio frequency (RF) geo-spatial analytics to those working to make the world a safer place. The company operates a commercial satellite constellation that detects, geo-locates, and identifies a broad range of signals & behaviors. We employ cutting-edge AI techniques to equip our global customers with the high-impact insights needed to make decisions with confidence. Headquartered in Herndon, Virginia.
The client is committed to hiring and retaining a diverse workforce. They are proud to be an Equal Opportunity Employer, making decisions without regard to race, color, religion, sex, sexual orientation, gender identity, gender expression, marital status, national origin, age, veteran status, disability, or any other protected class.
Education: Bachelor's degree
- Maintain and extend existing and new applications to provide data and insights for different teams and products, including customers.
- Discuss, plan, segment incoming requirements from customers and other teams to realize new features and achieve the best possible outcome.
- Research new and alternative technologies to solve problems efficiently.
- Guide and support students and juniors in their work.
- Test and review your own code and that of coworkers to minimize maintenance and increase code quality.
- Maintain documentation that allows other company members to understand your projects and makes troubleshooting easier.
- Contribute to a continuous improvement of the development process in the team.
- Provide assistance to other teams when interacting with data supplied by the data team to achieve useful outputs.
- Your main values match ours: ownership, collaboration, innovation, respect and authenticity.
- Min. 5 years experience in software engineering and data processing.
- Proactive personality who seeks out information and thinks outside the box.
- You write correct, elegant, flexible and performant code to keep your own bug fixing efforts low.
- Eagerness to take on responsibility, intrinsic motivation, and reliability.
- Experience working with high level programming languages and test frameworks.
- Deep understanding of parallel and distributed systems.
- Familiar with the challenges of high-volume stream and data processing, incl. message broker concepts (Apache Kafka; see the consumer sketch after this list).
- Deep knowledge of software engineering principles and agile development processes.
- High level of experience working with databases (SQL, NoSQL, Cache).
- Familiar with AWS, Kubernetes, Docker.
- Fluent in English, German is a plus.
- Valid work permit for country of employment.
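To illustrate the kind of Kafka-based stream processing referenced above, here is a minimal consumer sketch using the kafka-python client. The topic name, broker address, and message schema are assumptions for illustration only.

```python
# Minimal stream-processing sketch with the kafka-python client. The topic,
# broker address, and message schema are illustrative assumptions.
import json

from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "abuse-reports",                     # placeholder topic
    bootstrap_servers="localhost:9092",  # placeholder broker
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
    auto_offset_reset="earliest",
    group_id="report-enricher",
)

for message in consumer:
    report = message.value
    # Placeholder enrichment: tag each report with its /24 source network.
    report["source_network"] = report.get("ip", "unknown").rsplit(".", 1)[0]
    print(report)
```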
- Work with the latest technology, in the cloud and with the latest frameworks.
- Share defined projects as open source.
- An extremely steep learning curve.
- Impact, responsibility and participation in a very interesting field.
- Competitive pay according to your country and meaningful equity.
- Home office equipment or coworking space membership contribution.
- Flexible working hours.
- Fast-paced high-tech B2B company.
- Conference and travel budget.
- Fun team events.
- Additional benefits depending on your place of living.
- Ownership - Everybody takes the seat as the Co-CEO of Abusix to drive results proactively.
- Collaboration - We genuinely believe that we are stronger together as a team.
- Innovation - We love to challenge the status quo and want our people to be brave and embrace failure in a safe environment.
- Respect - Our daily interactions are guided through respect for each other, our customers, and the environment.
- Authenticity - We are true to our personalities and values, have fun at work and don't take ourselves too seriously.
Are you:
- A multi-faceted all-star who loves streamlining the accurate flow of data from one system to another through a combination of low-code tools and custom scripting?
- Someone who loves data visualization and has a knack for presenting data elegantly using modern Business Intelligence tools?
- Someone who loves details, and leaves no stone unturned when trying to solve a problem or check your work?
- Someone who has a unique combination of accounting, data science and engineering skills, but still operates with warmth, compassion and humility in everything you do?
If so, you might be the right person to work directly with the CEO to tackle some of the firm's most challenging problems – how to efficiently access, clean, and move large amounts of data between software systems, both internally and for clients.
Job Description
Support Existing Customer Automations
- You will quickly get up to speed on the data pipeline our client has built via Python, JavaScript, custom macros in Excel and Google Sheets and low-code tools
- You will manage tools that access customer data in SaaS apps via API or sometimes through manual report downloads when APIs are not available
- Your accounting knowledge will help you understand the data manipulation they must do before pushing customer data into QuickBooks Online via its API (a simplified sketch follows this list). You can then check your work with your understanding of Journal Entries and Debits and Credits
- In learning and supporting the existing automations they've built, you'll immediately look for ways to improve the overall systems – using new tools or improved safeguards to best serve customers' needs
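As a hedged sketch of the kind of QuickBooks Online push described above, the snippet below posts a balanced journal entry to the QBO REST API with requests. The realm ID, token, and account references are placeholders; a production pipeline would handle OAuth2 token refresh and error recovery.

```python
# Hedged sketch: create a balanced journal entry via the QuickBooks Online
# REST API. Realm ID, token, and account refs are placeholders.
import requests

REALM_ID = "1234567890"   # placeholder company ID
TOKEN = "ACCESS_TOKEN"    # placeholder OAuth2 bearer token
URL = f"https://quickbooks.api.intuit.com/v3/company/{REALM_ID}/journalentry"

journal_entry = {
    "Line": [
        {   # Debit an expense account...
            "DetailType": "JournalEntryLineDetail",
            "Amount": 100.00,
            "JournalEntryLineDetail": {
                "PostingType": "Debit",
                "AccountRef": {"value": "42"},  # placeholder account
            },
        },
        {   # ...and credit cash, so debits equal credits.
            "DetailType": "JournalEntryLineDetail",
            "Amount": 100.00,
            "JournalEntryLineDetail": {
                "PostingType": "Credit",
                "AccountRef": {"value": "35"},  # placeholder account
            },
        },
    ]
}

resp = requests.post(
    URL,
    json=journal_entry,
    headers={"Authorization": f"Bearer {TOKEN}", "Accept": "application/json"},
    timeout=30,
)
resp.raise_for_status()
```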
Support Our Client’s Existing Automations
- Similarly, you’ll support existing critical process automation they’ve built internally with Python, Google Sheets, APIs and data warehousing in Google Big Query. This work supports critical parts of monthly operations including billing, profit-sharing, etc.
- Working directly with the CEO, you’ll streamline both the creation and presentation of various internal reports we use on a weekly basis
Tackle New Automation Opportunities
- You’ll work directly with the team to seek out opportunities to further automate data services for clients. This will include seeking out automation opportunities outside of finance and accounting workflows. This may include automating parts of their marketing processes, field scheduling, inventory purchasing, etc. Over time, you’ll build out a team to scale the process automation services for clients
- Internally, you'll work directly with the CEO to help implement and integrate a new CRM into their proposal and workflow management tools via thoughtful use of tools like Zapier. You'll tackle other interesting challenges internally, including building systems for more real-time margin presentation and budget-to-actual analysis
- Fluency in Python and JavaScript and a strong understanding of low-code tools such as Zapier and Integromat. What you don’t know, you can learn quickly
- Strong familiarity with accounting, including an understanding of how a Balance Sheet and P&L work together. You are familiar with Journal Entries, Debits and Credits, and QuickBooks Online
- Ability to access QuickBooks data and connect it in relevant ways that make businesses easier to run and more profitable
- Obsession with documentation and adherence to process. You love creating and following structured workflows, and get excited by helping others adhere to consistent processes
- Humility, patience, and a deeply rooted servant leadership mentality. You love to “get in the weeds” and avoid regular micromanagement
- Excellent communication skills – with both clients and team members across multiple communication channels (email, MS Teams, Zoom, etc.)
- Ability to quickly build expertise in cloud accounting technologies (QBO, Bill.com, Gusto, Divvy/Ramp, Expensify/Tallie, Salesforce, Google Drive, etc.)
- Resourcefulness – an expert problem solver who is not afraid to ask for help when needed
- A keen attention to detail – with a sense of integrity, dependability, and joyfulness
What’s In It for You:
- 100% remote work – but you’ll also get to travel to conferences and to a once-a-year company gathering. Other key strategy sessions throughout the year will also be held in-person
- Generous compensation – including health and dental insurance, retirement matching, PTO, maternity/paternity leave, annual-wellness benefits and much more
- Joyful culture – team members love what they do, and that infectious energy permeates everywhere throughout the company.
- Endless growth opportunities – this is a fast-growing company on a decades-long journey of evolution that loves team members who have a similar growth mindset and horizon
About The Firm:
We are a fast-growing, cloud finance and accounting services firm founded on two equal missions:
- Providing team members with their Dream Job. This means building a workplace where team members experience joy daily, are cared for and respected, afforded flexibility, given room to grow professionally and are compensated at the top of the industry
- De-stressing and freeing up clients via excellent and modern cloud-based finance and accounting services. We handle accounting, payroll, bill-pay, invoicing, reporting and other needs for customers so they can focus on their business, organization, family or other priorities.
If you feel you have the necessary qualifications to perform this job, please forward a current copy of your resume and state your salary requirements.
Paddle offers SaaS companies a completely different approach to their payments infrastructure. Instead of assembling and maintaining a complex stack of payments-related apps and services, we’re a Merchant of Record for our customers, taking away 100% of the pain of payments fragmentation. It’s faster, safer, cheaper, and, above all, way better.
In May 2022, we joined forces with ProfitWell. ProfitWell provides BI solutions that improve retention and monetization automatically through unmatched subscription intelligence. As one team and one platform, we offer the "done for you" approach to SaaS payments, billing, and growth.
We’re backed by investors including KKR, FTV Capital, Kindred, Notion, and 83North and serve over 3000 software sellers in 245 territories globally.
The Role:
The Data & Insights team are building the foundations to support Paddle's growth and demand for commercial data. This is an exciting time to join, where you will be part of a team migrating to a scalable BI stack and driving proactive business insights to impact our Go-to-Market strategy.
The Data & Insights team are responsible for all aspects of data modelling in the commercial functions, including but not limited to: business forecasting, revenue planning, demand generation pipeline, and ABM reporting.
The role of the Data & Insights Analyst is to support commercial functions in understanding their data, both on an ad hoc and project basis. This includes capturing and understanding business requirements, designing and deploying appropriate data models and presenting insights derived from this data to enable operational performance.
What you'll do:
- Contribute to the development of Data and Insights projects end-to-end:
- Work on scoped briefs under the guidance of the Data and Insights Manager and Senior Analysts.
- Work with CRM, Marketing data, product data, .csv files and several other data sources through usage of Excel and SQL.
- Create best-in-class dashboards and visualisations to drive maximum insight from Paddle’s data.
- Collaborate with other members of the Data & Insights, and broader Revenue Operations team.
- Become a subject matter expert on Paddle commercial data.
- Collaborate on the development of self-serve initiatives across the business to support both technical and non-technical teams with data discovery and drawing actionable insights.
- Contribute to strategic planning by deriving insights and recommendations from projects and reporting, to help shape the direction of the commercial business. Including but not limited to:
- Business Intelligence dashboarding.
- Data discovery.
- Development of industry knowledge and best practices.
- Customer revenue modelling
- Strong commercial experience in SQL and Excel; experience with Python is a plus (a brief example follows this list).
- Experience driving insights from complex datasets.
- Strong data visualisation and presentation techniques.
- Exposure to BI and ETL tools, such as Sisense, Looker, Tableau and Power BI. Working knowledge of dbt and Snowflake is a plus.
- A proactive approach to challenges and comfort operating in an unstructured environment
- Proven project and stakeholder management skills
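As one small example of the SQL-driven insight work described above, here is a sketch that pulls monthly revenue by segment from a warehouse. It assumes a Snowflake connection and a hypothetical fct_deals table; both are illustrative, since the posting only names Snowflake as a plus.

```python
# Illustrative sketch: monthly revenue by segment. Assumes a Snowflake
# warehouse and a hypothetical fct_deals table; credentials are placeholders.
import pandas as pd
import snowflake.connector

conn = snowflake.connector.connect(
    account="xy12345", user="analyst", password="***",   # placeholders
    warehouse="ANALYTICS_WH", database="COMMERCIAL", schema="MARTS",
)

query = """
SELECT date_trunc('month', closed_at) AS month,
       segment,
       SUM(amount_usd)                AS revenue
FROM fct_deals
WHERE closed_at >= dateadd(month, -12, current_date)
GROUP BY 1, 2
ORDER BY 1, 2;
"""

monthly = pd.read_sql(query, conn)
# Snowflake uppercases unquoted column names, hence the uppercase keys.
# Pivoting gives a dashboard-ready month-by-segment matrix.
print(monthly.pivot(index="MONTH", columns="SEGMENT", values="REVENUE"))
```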
Everyone is welcome at Paddle
At Paddle, we're committed to removing invisible barriers, both for our customers and within our own teams. We recognise and celebrate that every Paddler is unique and we welcome every individual perspective. As an inclusive employer we don't care if, or where, you studied, what you look like or where you're from. We're more interested in your passion for learning and what you'll bring to the table. We encourage you to apply even if you don't match every part of the job ad, especially if you're part of an underrepresented group. Please let us know if there's anything we can do to better support you through the application process and in the workplace. We're committed to building a diverse team where everyone feels safe to be their authentic self. Let's grow together.
Why you'll love working at Paddle
We are a diverse, growing group of Paddlers across the globe who pride ourselves on our transparent, collaborative and respectful culture. We live and breathe our values, which are:
Exceptional Together
Execute with impact
Better than Yesterday
We offer a full suite of benefits, including attractive salaries, stock options, retirement plans, private healthcare and wellbeing initiatives.
We are a ‘digital-first’ company, which means you can work remotely, from one of our stylish hubs, or even a bit of both! We offer all team members unlimited holidays and 4 months paid family leave regardless of gender. We love our casual dress code, annual company retreats and much more. We invest in learning and will help you with your personal development via constant exposure to new challenges, an annual learning fund, and regular internal and external training.
#LI-REMOTE
As the Data SRE Manager you'll be responsible for managing the data infrastructure for all of Mozilla. You will work closely with Mozilla's Data Organization, which includes Data Engineering and Data Science teams, to develop and deliver a data platform to support Mozilla's data-driven culture. Creative problem-solving skills and an innovative demeanor will be key to success.
As the Data SRE Manager you will...
- Empower a hardworking team of SREs to grow in their roles and align the team's work with Mozilla's needs.
- Own the infrastructure for the data platform, data tools, and integrations.
- Manage and prioritize new projects, ongoing support, and technical debt.
- Drive improvements from operations back into development and vice versa.
- Measure, maintain, and report appropriate metrics across products to internal and external partners.
Your Professional Profile
- Bachelor's degree (or higher) in Computer Science and/or equivalent experience.
- 4+ years SRE experience, 2+ years SRE leadership experience.
- Ability to accept direction and communicate effectively across multiple levels of management and technical expertise.
- Solid experience in cloud environments, specifically AWS or GCP.
- Working knowledge of database systems (SQL and/or non-relational).
- Experience with ETL, data modeling, and cloud-based data storage and processing, including GCP Data Services (Dataflow, BigQuery, Dataproc).
- Experience in workflow and data pipeline orchestration (Airflow, Jenkins, etc.)
About Mozilla
Mozilla exists to build the Internet as a public resource accessible to all because we believe that open and free is better than closed and controlled. When you work at Mozilla, you give yourself a chance to make a difference in the lives of Web users everywhere. And you give us a chance to make a difference in your life every single day. Join us to work on the Web as the platform and help create more opportunity and innovation for everyone online.
Commitment to diversity, equity, inclusion, and belonging
Mozilla understands that valuing diverse creative practices and forms of knowledge is crucial to, and enriches, the company's core mission. We encourage applications from everyone, including members of all equity-seeking communities, such as (but certainly not limited to) women, racialized and Indigenous persons, persons with disabilities, and persons of all sexual orientations, gender identities, and expressions.
We will ensure that qualified individuals with disabilities are provided reasonable accommodations to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment, as appropriate. Please contact us at [email protected] to request accommodation.
We are an equal opportunity employer. We do not discriminate on the basis of race (including hairstyle and texture), religion (including religious grooming and dress practices), gender, gender identity, gender expression, color, national origin, pregnancy, ancestry, domestic partner status, disability, sexual orientation, age, genetic predisposition, medical condition, marital status, citizenship status, military or veteran status, or any other basis covered by applicable laws. Mozilla will not tolerate discrimination or harassment based on any of these characteristics or any other unlawful behavior, conduct, or purpose.
Group: D
#LI-REMOTE
Req ID: R2002
We are interviewing and onboarding 100% virtually at this time. PagerDuty is focused on inclusion and employee well-being by building a culture that isn’t location specific and gives equal opportunity to everyone—regardless of where you are working. Unless your job requirements make it necessary to be in a company office, you may choose to work in-office, remotely, or hybrid.
PagerDuty is growing and we are looking for an experienced Data Engineer for our Data team in IT to manage and contribute to the software and services that we provide to our users. As a Data Engineer at PagerDuty, you will help lead the team responsible for designing, building, deploying, and supporting solutions for teams across PagerDuty's growing global user base. You are scrappy, independent, and excited about having a big impact on a small but growing team.
Together with the other members of the Data Platform team, you will have the opportunity to re-define how PagerDuty designs, builds, integrates, and maintains a growing set of software and SaaS solutions. In this role, you will be working cross-functionally with business domain experts, analytics, and engineering teams to re-design and re-implement our Data Warehouse model(s).
How You Contribute to Our Vision: Key Responsibilities
- Translate business requirements into data models that are easy to understand and used by different disciplines across the company. Design, implement and build pipelines that deliver data with measurable quality under the SLA (see the sketch after this list)
- Partner with business domain experts, data analysts and engineering teams to build foundational data sets that are trusted, well understood, aligned with business strategy and enable self-service
- Be a champion of the overall strategy for data governance, security, privacy, quality and retention that will satisfy business policies and requirements
- Own and document foundational company metrics with a clear definition and data lineage
- Identify, document and promote best practices
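To ground the "foundational data sets" idea, here is a hedged sketch that materializes a daily incident fact table from raw events via SQL run from Python. The snowflake-sqlalchemy URL and table names are assumptions, not PagerDuty's actual model.

```python
# Hedged sketch: materialize a daily incident fact table from raw events.
# The snowflake-sqlalchemy URL and table names are assumptions.
import sqlalchemy

engine = sqlalchemy.create_engine("snowflake://user:***@account/analytics")  # placeholder

DDL = """
CREATE TABLE IF NOT EXISTS fct_incident_daily AS
SELECT date_trunc('day', created_at)                    AS day,
       service_id,
       COUNT(*)                                         AS incidents,
       AVG(datediff('minute', created_at, resolved_at)) AS avg_minutes_to_resolve
FROM raw_incidents
GROUP BY 1, 2;
"""

with engine.begin() as conn:
    # engine.begin() wraps the statement in a committed transaction.
    conn.execute(sqlalchemy.text(DDL))
```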
About You: Skills and Attributes
- You will design, implement and scale data pipelines that transform billions of records into actionable data models that enable data insights.
- You will help lead initiatives to formalize data governance and management practices, rationalize our information lifecycle and key company metrics.
- You will provide mentorship and hands-on technical support to build trusted and reliable domain-specific datasets and metrics.
- You will have deep technical skills, be comfortable contributing to a nascent data ecosystem, and building a strong data foundation for the company.
- You will be a self-starter, detail and quality oriented, and passionate about having a huge impact at PagerDuty.
Minimum Requirements
- Bachelor's degree in Computer Science, Engineering or related field, or equivalent training, fellowship, or work experience
- 7+ years of experience working in data architecture, data pipelines, data modeling, master data management, metadata management
- Experience designing, deploying, and leading end to end Data platforms in a cloud-based and Agile environment
- Very strong experience in scaling and optimizing schemas, and in writing complex, performance-tuned SQL and data pipelines in OLTP, OLAP, and data warehouse environments.
- Advanced knowledge of relational databases and the ability to write complex SQL.
- Experience with a cloud-based data warehousing platform such as Snowflake or AWS Redshift.
- Experience with data pipeline and workflow tools such as Airflow, including upgrades and administration tasks. Experience with AWS-managed Airflow is a plus.
- Proficiency with one or more object-oriented and/or functional programming languages - Python/Scala/PySpark.
- Hands-on experience with Big Data technologies like Spark - Databricks Preferred.
- Experience with AWS - S3, SQS, Lambda, Athena and more.
- Experience with ETL tools such as Fivetran, Segment, Mulesoft, or others
- Knowledge of at least one data visualization tool like Tableau, Periscope or Looker
- Knowledge of Salesforce and its processes is a big plus.
- A consistent track record of close collaboration with business partners and crafting data solutions to meet their needs
- Excellent written and verbal communication and interpersonal skills, able to effectively collaborate with technical and business partners
- A smart individual who can pick up and jump into any new technology as and when needed, and a good team player.
Preferred Qualifications
- Experience with Real time and other machine learning frameworks
- Familiarity with Infrastructure as a code - Terraform is a plus
Apply anyway! We extend opportunities to a broad array of candidates, including those with diverse workplace experiences and backgrounds. Whether you're new to the corporate world, returning to work after a gap in employment, or simply looking to transition or take the next step in your career path, we are excited to connect with you.
PagerDuty Offers
We are dedicated to providing a culture where our people are happy, enabled and inspired to do their best. One of the ways we do this is by developing a comprehensive total rewards approach that supports employees and their loved ones. As a global organization, our programs are competitive with industry standards and aligned with local laws and regulations.
Your package may include:
- Competitive salary and company equity
- Comprehensive benefits package from day one
- ESPP (Employee Stock Purchase Program)
- Retirement or pension plan
- Paid parental leave - up to 22 weeks for pregnant parent, up to 12 weeks for non-pregnant parent (some countries have longer leave standards and we comply with local laws)
- Generous paid vacation time
- Paid holidays and sick leave
- Paid employee volunteer time - 20 hours per year
- Bi-annual company-wide hack weeks
- Mental wellness programs
- Dutonian Wellness Days - scheduled company-wide paid days off in addition to PTO and scheduled holidays
- HibernationDuty - a week each year when everyone at PagerDuty, with the exception of a small, coverage crew, is asked to take a much needed break to truly disconnect and recharge
PagerDuty, Inc. (NYSE:PD) is a global leader in digital operations management, serving over 14,000 customers and 850,000 users worldwide, including 65% of the Fortune 100.
For the teams who build and run digital systems, PagerDuty is the best way to manage the urgent, mission-critical work that is essential to keeping digital services always on. We make it easy to handle any unplanned task, event, or opportunity, right away.
Led by CEO Jennifer Tejada, 50% of our board of directors is comprised of women, 45% of our managers are from underrepresented groups, and we are a proud member of the Pledge 1% Movement, committed to donating 1% Equity, 1% Employee time, and 1% Product to accelerate change in our communities. We are Great Place to Work-certified™ and our product is top rated in its category on TrustRadius.
From how we build our teams to who sits in the boardroom, we hope you can see yourself at PagerDuty.
Learn more: Social Impact; Inclusion, Diversity, & Equity; Culture
Additional Information
PagerDuty is committed to creating a diverse environment and is an equal opportunity employer. PagerDuty does not discriminate on the basis of race, religion, color, national origin, gender, sexual orientation, age, marital status, parental status, veteran status, or disability status.
PagerDuty is committed to providing reasonable accommodations for qualified individuals with disabilities in our job application process. Should you require accommodation, please email [email protected] and we will work with you to meet your accessibility needs.
PagerDuty verifies work authorization in accordance with the requirements of your local jurisdiction.
Learn more about our culture by checking us out on Instagram @PagerDutyLife!
We are looking for a Sr. Database Engineer to be responsible for designing, building, implementing, configuring, maintaining, debugging, and securing a wide variety of databases and clusters on Linux servers. This position will be in the Technical Operations team, will directly assist various engineering teams, and be part of an on-call rotation. Determining which database technology is appropriate for various workloads, moving data from one database to another, enhancing performance through configuration and schema modifications, and working in both cloud and on-premises systems will be frequent events.
Requirements
- Significant experience with at least one of the following database technologies, as well as knowledge of most others and a willingness to quickly learn more:
- MySQL/MariaDB
- Redis
- DynamoDB
- ElasticSearch
- PostgreSQL
- Experience operating databases with multiple layers of redundancy and scalability
- Experience with migrating data from one database type to another (see the migration sketch after the Preferred Skills list)
- Experience operating databases on on-premises Linux servers and Cloud Services
Preferred Skills
- Familiarity with database libraries
- Familiarity with additional data store types:
- Hadoop
- Ceph
- MTBL
- Memcached
- Linux administration experience:
- RHEL/Centos/Rocky
- Debian/Ubuntu
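As a small illustration of the database-to-database migration work mentioned in the requirements, here is a sketch that copies rows from a MySQL table into Redis hashes. Hosts, credentials, and the users schema are placeholders.

```python
# Illustrative migration sketch: copy rows from a MySQL table into Redis
# hashes. Hosts, credentials, and the users schema are placeholders.
import pymysql
import redis

source = pymysql.connect(host="db.internal", user="reader", password="***",
                         database="app", cursorclass=pymysql.cursors.DictCursor)
target = redis.Redis(host="cache.internal", port=6379)

with source.cursor() as cur:
    cur.execute("SELECT id, email, plan FROM users")
    for row in cur:
        # One Redis hash per user keeps lookups O(1) by primary key.
        target.hset(f"user:{row['id']}",
                    mapping={"email": row["email"], "plan": row["plan"]})
```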
Benefits
DomainTools is the global leader for internet intelligence and the first place security practitioners go when they need to know. The world's most advanced security teams use our solutions to identify external risks, investigate threats, and proactively protect their organizations in a constantly evolving threat landscape. DomainTools constantly monitors the Internet and brings together the most comprehensive and trusted domain, website and DNS data to provide immediate context and machine-learning driven risk analytics delivered in near real-time.
DomainTools embraces diversity, equity, and inclusion to its fullest as an equal opportunity employer. We build our teams so creativity and innovation can flourish. We believe inclusivity and equity foster innovation and growth, and we harness this mindset to drive a culture that serves our employees and our customers. We encourage people of all backgrounds, ages, perspectives, and skill sets to apply; and do not discriminate based on age, religion, color, national origin, gender, sexual orientation, gender identity, marital status, veteran status, disability, or any other characteristic protected by law.
Who We Are:
Zearn is the nonprofit educational organization behind Zearn Math, the top-rated math learning platform used by 1 in 4 elementary students nationwide. Zearn Math supports teachers with research-backed curriculum and digital lessons proven to double the learning gains of a typical year of instruction. Zearn Math instructional materials - including 400+ hours of digital math learning - are free for teachers and families. Zearn also offers school- and district-wide licenses and professional development to support implementation. Everything Zearn does is driven by the belief that every kid can be a math kid.
Learn more about us at https://about.zearn.org/.
As a Data Analyst you will query and transform Zearn's data to help teams at Zearn answer questions internally or report externally to partners (e.g., school districts). If you thrive in a dynamic environment, love learning how things work, and are an effective communicator, then we'd love to meet you. This is a unique opportunity to play a critical role in a fast-growing and widely loved organization that is dedicated to ensuring all children love learning math.
What This Role Will Do:
- Build and maintain dashboards in Looker and Hex
- Query our Redshift database
- Pull data from S3 buckets when needed (see the sketch after this list)
- Help increase the volume of requests we can handle, as well as the speed at which we process them
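A small sketch combining the S3 and Redshift tasks above: pull a CSV export from S3 with boto3, then query Redshift over its Postgres-compatible interface. Bucket, key, cluster endpoint, and table names are placeholders, not Zearn's actual infrastructure.

```python
# Sketch of the two tasks above; bucket, key, endpoint, and tables are
# placeholders, not Zearn's actual infrastructure.
import boto3
import pandas as pd
import psycopg2

# 1) Pull a CSV export from S3.
s3 = boto3.client("s3")
obj = s3.get_object(Bucket="zearn-exports", Key="usage/2022-09.csv")  # placeholder
usage = pd.read_csv(obj["Body"])

# 2) Query Redshift; psycopg2 works because Redshift speaks the Postgres
#    wire protocol.
conn = psycopg2.connect(host="cluster.example.redshift.amazonaws.com",
                        port=5439, dbname="analytics",
                        user="analyst", password="***")
lessons = pd.read_sql(
    "SELECT district_id, COUNT(*) AS lessons FROM lesson_completions GROUP BY 1",
    conn,
)

print(usage.head())
print(lessons.head())
```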
What You’ll Bring to the Role:
- Bachelor’s degree required (Computer Science, Math, or Data Science preferred)
- Data analysis and/or programming experience
- Experience with Python and/or SQL
- A true passion for helping people and creating positive experiences
- Desire to join a fast-paced environment at a mission-driven organization
Location:
This role is remote, but you may be asked to travel to our New York City office periodically for team building when it reopens.
Compensation & Benefits:
We offer a competitive benefits package, including comprehensive medical, dental and vision plans, short- and long-term disability, life insurance, 401K matching, parental leave, and a generous PTO policy.
To Apply: https://apply.workable.com/j/4B6A0EAD2C
Zearn is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees. All employment is decided on the basis of qualifications, merit, and business need.
Seeking an experienced Implementation Specialist!
Zone & Co's recent acquisition of Satori has provided us with many opportunities to continue to grow our (now combined) teams. Satori Reporting offers the only pre-built Power BI reporting solution for NetSuite. Together we can accelerate global NetSuite development and adoption with a broader product offering and extensive expertise in NetSuite and business operations.
About Satori Reporting
Within business applications, there is a treasure chest of data. Our mission is to enable our customers to transform it into translatable information that yields actionable insight and value for driving strategic business decisions. We facilitate enterprise-wide reporting with centralized data and a single version of the truth for our customers' reporting needs. While Satori has pre-built data models for NetSuite, Salesforce, and Google Analytics/Ads (with more to come) and access to hundreds of pre-built connectors, we are excited to be NetSuite's only pre-built business intelligence solution with a data model and full ETL processes that can support the most complex and customized configurations. We are proud to be certified partners of both NetSuite and Microsoft.
This is an incredibly exciting and momentous time for Zone with our recent funding and Fast Four and Satori acquisitions, so there's a wonderful amount of opportunity to make your mark and make an impact.
Job Description
As an Implementation Specialist, you will support the base product install for new customers during the implementation process. This includes technical implementation, product delivery, technical support, communication, and project management for the Business Intelligence solutions of Satori Reporting.
Primary Responsibilities
- Serve as a supporting point of contact for new customers during the implementation process
- Lead technical calls with customers and validate all requirements are met during the implementation process
- Responsible for technical implementation and product delivery in a well-organized and timely manner with adherence to service level agreements
- Complete set up and base product configuration using multiple tools such as NetSuite, Azure blob storage, Tabular Editor, and Power BI.
- Assist customers with user permissions, data source connections and review of standard reporting templates
- Provide support and assistance with all tasks related to the scope of implementation
- Investigate, troubleshoot, and analyze customer issues and requests to ensure prompt resolution or escalation to the appropriate team member in a timely manner
- Support ongoing customer success and training by assisting with follow up requests and questions during and after customer connection and go-live
- Maintain a positive attitude and the ability to adapt and learn processes, tools, or techniques quickly
- Contribute as a team player through participation and collaboration with other team members
- Provide exemplary customer service to each customer contact with professional communication and prompt, courteous responses
Requirements
- 1-2 years of related experience with reporting, data and analytics, and business intelligence operations
- Prior experience using Power BI, Power Query and Template Apps
- Basic knowledge of business intelligence product solutions
- Ability to multitask and meet timelines in a fast-paced environment
- Prior experience with project management tasks and successful product delivery
- Excellent verbal and written communication and interpersonal skills
Preferred Skills & Experience
- Bachelor’s degree in Business, Computer Science, Information Systems, Finance, Statistics, or a related field
- Prior experience with NetSuite and knowledge of saved searches and custom fields, records, and segments
- Prior experience with data validation
- Basic understanding of accounting concepts and financial reports
- Prior experience with Azure tools and cloud environments
- Prior experience with reporting tools such as SSRS, Power BI Paginated Reports, or Power BI Template Apps
Employment Type: Full Time/Exempt
Location: 100% Remote
Benefits
Benefits at Zone are all about helping you lead a fulfilling life outside of work. We know work is only one part of your life, so we do everything we can to support it. We offer fully paid parental leave. We celebrate no-Friday weeks during the summer (every other Friday off) and provide an initial stipend for setting up your home office. All on top of our fully covered, top-tier health insurance and unlimited vacation. And we are 100% remote!
We strongly encourage candidates of all different backgrounds and identities to apply. This is an opportunity for us to bring in a different perspective and we're eager to further diversify our company. Zone & Co is committed to building an equitable, inclusive, and supportive place for you to do some of the greatest work of your career.
Zone and Co is an Equal Opportunity Employer committed to diversity in the workplace. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, age, national origin, disability, protected veteran status, gender identity, or any other factor protected by applicable federal, state, or local laws.
Seeking an experienced Business Intelligence Developer!
Zone & Co's recent acquisition of Satori has provided us with many opportunities to continue to grow our (now combined) teams. Satori Reporting offers the only pre-built Power BI reporting solution for NetSuite. Together we can accelerate global NetSuite development and adoption with a broader product offering and extensive expertise in NetSuite and business operations.
About Satori Reporting
Within business applications, there is a treasure chest of data. Our mission is to enable our customers to transform it into translatable information that yields actionable insight and value for driving strategic business decisions. We facilitate enterprise-wide reporting with centralized data and a single version of the truth for our customers' reporting needs. While Satori has pre-built data models for NetSuite, Salesforce, and Google Analytics/Ads (with more to come) and access to hundreds of pre-built connectors, we are excited to be NetSuite's only pre-built business intelligence solution with a data model and full ETL processes that can support the most complex and customized configurations. We are proud to be certified partners of both NetSuite and Microsoft.
This is an incredibly exciting and momentous time for Zone with our recent funding and Fast Four and Satori acquisitions, so there's a wonderful amount of opportunity to make your mark and make an impact.
Job Description
As a Business Intelligence Developer, you will assist with custom data and reporting solutions for Satori's diverse client base, including database modification, data modeling, disparate system integration, custom security solutions and roles, data governance, and custom reporting or analytics.
Primary Responsibilities
- Assist your assigned consulting team with custom development projects for your client portfolio
- Assist clients in identifying various required data sources, design solutions, and complete data integration or ingestion projects to expand the client’s data model or data warehouse
- Responsible for technical development using Microsoft's BI stack and product delivery in a well-organized and timely manner with adherence to project methodology and service level agreements
- Lead technical calls with clients to gather requirements and validate all requirements are met during the development process
- Responsible for timely and efficient development within the approved project budget and scope
- Investigate and troubleshoot client reported issues with data storage capacity, report performance and data refresh issues
- Contribute as a team player through participation and collaboration with other team members
- Maintain a positive attitude and the ability to adapt and learn processes, tools, or techniques quickly
- Provide exemplary client service to each client contact with professional communication and prompt, courteous responses
- Responsible for time reporting of all hours and adherence to targets and goals for billable hour utilization
Requirements
- 3-5 years of related experience such as product development, system management, and business intelligence operations
- 3-5 years of prior experience with relational databases, including SQL, and database objects such as views, stored procedures, functions, and their use in a business intelligence environment
- 3-5 years of prior experience using query development, scripting, and programming skills such as SQL, Power Pivot, DAX, M code
- 3-5 years of prior experience with ETL processes and data integration solutions
- 3-5 years of prior experience developing and supporting tabular data models using tools such as Tabular Editor, DAX Studio, Visual Studio, Azure Analysis Services, and ALM Toolkit
- 3-5 years of prior experience with version control processes using Azure DevOps and Git
- 3-5 years of prior experience using Power BI, Power Query and Power Platform Administration
Preferred Skills & Experience
- Bachelor’s degree in Business, Computer Science, Information Systems, or a related field
- Microsoft DA-100 Certification
- Expanded experience with data warehousing
- Prior experience in an Azure environment using Azure SQL Databases and Azure Data Factory
- Prior experience using NetSuite or similar ERP systems
- Prior experience with reporting tools such as SSRS, Power BI Paginated Reports, Power BI Template Apps
Employment Type: Full Time/Exempt
Location: 100% Remote
Benefits
Benefits at Zone are all about helping you lead a fulfilling life outside of work. We know work is only one part of your life, so we do everything we can to support it. We offer fully paid parental leave. We celebrate no-Friday weeks during the summer (every other Friday off) and provide an initial stipend for setting up your home office. All on top of our fully covered, top-tier health insurance and unlimited vacation. And we are 100% remote!
We strongly encourage candidates of all different backgrounds and identities to apply. This is an opportunity for us to bring in a different perspective and we're eager to further diversify our company. Zone & Co is committed to building an equitable, inclusive, and supportive place for you to do some of the greatest work of your career.
Zone and Co is an Equal Opportunity Employer committed to diversity in the workplace. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, age, national origin, disability, protected veteran status, gender identity, or any other factor protected by applicable federal, state, or local laws.
Data Entry Coordinator
Reports to: Manager, Donor Relations
Position Status: Part-Time Contractor (20 hours per week)
Location: Remote
The Parkinson's Foundation makes life better for people with Parkinson's disease by improving care and advancing research toward a cure. In everything we do, we build on the energy, experience and passion of our global Parkinson's community.
The Parkinson’s Foundation is seeking a goal-oriented and self-motivated professional to serve as a part-time contractor in the role of Matching Gifts Coordinator. This position is responsible for managing the Foundation’s Matching Gifts program by responding to matching gift inquiries, opt-outs, and confirmation requests. The coordinator will also provide support for entering all matching gifts into Raiser’s Edge and Luminate while ensuring that proper designations are applied.
It is essential that the Data Entry Coordinator demonstrates and upholds the Foundation’s values of collaboration, dedication, excellence, integrity, positivity, responsiveness, and teamwork.
Responsibilities
- Review and respond to all email inquiries received via [email protected]
- Responsible for entering all check and ACH matching gift and employee giving transactions into the database.
- Confirm all matching gift confirmation requests from employers and third-party facilitators.
- Enter Matching Gift pledges into Raiser’s Edge for all confirmed gifts.
- Update and maintain the Double the Donation database
- Assist in maintaining Raiser’s Edge donor data and perform maintenance cleanup of donor records when needed.
- Other duties as assigned by management.
Experience/Skills Required
- At least two years of professional experience in a development or data management position.
- High School diploma or two-year degree preferred
- Proven track record of excellent data entry skills
- Proficiency in computer software programs, including Outlook, Word, Excel, and aptitude for working with donor database systems.
- Must demonstrate creativity, reliability, responsibility, flexibility, and self-motivation.
- Able to multi-task while also being highly detail oriented.
- Strong written and oral communication skills in English.
Compensation
Compensation for this position is competitive and depends on prior experience.
The Parkinson's Foundation is an equal opportunity employer. We are committed to diversity, equity, and inclusion in our culture and in our work on behalf of people with Parkinson's disease.
All new hires are required to be fully vaccinated against the COVID-19 virus, subject to any legally required accommodations.
DomainTools recently demonstrated success establishing an infrastructure platform, bringing development practices up to speed with modern best practices and laying a uniform foundation for new applications. We need help extending this approach to a data platform - establishing a common baseline for how data is collected, processed, stored, and ultimately presented to customers. This role requires a combination of technical excellence, tenacity, and strong leadership skills.
The Team
The Architecture team at DomainTools is a small team of experienced engineers dedicated to building an efficient, sustainable, and reliable foundation for DomainTools' technical products. We come from diverse backgrounds within the industry, including engineering, operations, and customer solutions.
Requirements
What you’ll do
- Provide technical leadership over all aspects of how DomainTools handles cyber-security data, including collection, analysis, governance, storage, and delivery
- Design and develop a data platform for current and future internet-scale datasets
- Collaborate with Research, Engineering, Operations and Product to design and develop features that empower security researchers to identify and respond to threat-actors quickly and accurately
- 8+ years of experience spanning some combination of technical implementation and technical leadership in a big-data adjacent role
- Expertise with real-time data collection, curation, governance, analysis, and storage
- Experience managing enterprise-scale data pipelines
- Experience bringing order to complex data systems (e.g. cataloging, schematizing, consolidating, and documenting data systems; see the sketch after this list)
- Demonstrated success driving design and implementation of data solutions affecting multiple teams
- Experience optimizing data architecture to balance efficiency with maintainability and time-to-delivery
- Understanding of public internet infrastructure. Conversant in “how the internet works” and why cyber-security is a rapidly growing industry
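To make the cataloging/schematizing bullet above concrete, here is a minimal, hedged sketch (not DomainTools' actual tooling; the record shapes are hypothetical) of inferring a field catalog from heterogeneous records, the kind of first step that exposes schema drift:

```python
from collections import defaultdict

def infer_field_catalog(records):
    """Catalog every field and the value types observed across records."""
    catalog = defaultdict(set)
    for record in records:
        for field, value in record.items():
            catalog[field].add(type(value).__name__)
    return dict(catalog)

# Hypothetical passive-DNS-style records whose shape has drifted over time:
records = [
    {"domain": "example.com", "first_seen": "2024-01-01", "risk": 0.12},
    {"domain": "example.net", "first_seen": "2024-02-10", "risk": "0.40"},
]
print(infer_field_catalog(records))
# e.g. {'domain': {'str'}, 'first_seen': {'str'}, 'risk': {'float', 'str'}}
```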
Benefits
DomainTools is the global leader for internet intelligence and the first place security practitioners go when they need to know. The world's most advanced security teams use our solutions to identify external risks, investigate threats, and proactively protect their organizations in a constantly evolving threat landscape. DomainTools constantly monitors the Internet and brings together the most comprehensive and trusted domain, website and DNS data to provide immediate context and machine-learning driven risk analytics delivered in near real-time.
DomainTools embraces diversity, equity, and inclusion to its fullest as an equal opportunity employer. We build our teams so creativity and innovation can flourish. We believe inclusivity and equity foster innovation and growth, and we harness this mindset to drive a culture that serves our employees and our customers. We encourage people of all backgrounds, ages, perspectives, and skill sets to apply; and do not discriminate based on age, religion, color, national origin, gender, sexual orientation, gender identity, marital status, veteran status, disability, or any other characteristic protected by law.
This is a full-time position with a preferred location in Western USA, Hawaii, or Alaska to ensure we have full coverage for the team without requiring overnight shifts. This role is remote, and English language fluency is a requirement.
Timescale🐯 is hiring a Database Support Engineer to join our growing, international team! The right candidate will be comfortable providing deep technical support and interacting with customers on a day-to-day basis. You should have deep experience with relational databases (PostgreSQL a strong plus!), and must be able to quickly gain a detailed understanding of TimescaleDB. You should have a “can-do” attitude, where helping the user comes first.
Be familiar with TimescaleDB! You will get HUGE bonus points for signing up for the free trial https://www.timescale.com/mst-signup and testing it out before you apply.
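If you do test-drive it, the snippet below is a minimal sketch of TimescaleDB's core primitive as driven from Python. It assumes a local PostgreSQL instance with the timescaledb extension installed; the connection details and table are hypothetical:

```python
import psycopg2  # pip install psycopg2-binary

# Hypothetical local connection; adjust for your trial instance.
conn = psycopg2.connect("dbname=tsdb user=postgres password=postgres host=localhost")
conn.autocommit = True
cur = conn.cursor()

cur.execute("CREATE EXTENSION IF NOT EXISTS timescaledb;")
cur.execute("""
    CREATE TABLE IF NOT EXISTS conditions (
        time        TIMESTAMPTZ NOT NULL,
        device_id   TEXT,
        temperature DOUBLE PRECISION
    );
""")
# create_hypertable() is TimescaleDB's core primitive: it partitions a
# regular table into time-based chunks behind a normal SQL interface.
cur.execute("SELECT create_hypertable('conditions', 'time', if_not_exists => TRUE);")
conn.close()
```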
- Work with our customers across a wide range of topics from basic administration of TimescaleDB to deep consultative conversations around design, optimization, and implementation.
- Manage support cases from beginning to end.
- Develop and maintain close relationships with our customers to help them be successful.
- Be curious, always willing to learn, and always looking for ways to improve our products, our processes, and our culture.
- Bachelor’s degree in Computer Science, Information Systems, or equivalent experience.
- Relational DB experience at scale: administration, data modeling, query optimization, etc.
- Experience as a technical support professional with external customers.
- Strong aptitude to understand and learn new software products.
- Strong communication skills and the ability to express complex technical concepts to a varied audience.
- Be familiar with TimescaleDB! You should download and test it out before you apply.
- Familiarity with broad cloud computing concepts and experience with AWS, GCP, and Azure.
- Experience with Relational databases, NoSQL databases, and data modeling tools.
- Familiarity with PostgreSQL, including best practices, Query Plan analysis, backup/restore, migrations, etc.
- A strong sense of curiosity and a willingness to wade into the unknown and learn as you go.
About Timescale🐯
Timescale is the creator of TimescaleDB, the industry-leading relational database for time-series. Tens of thousands of organizations trust TimescaleDB today with their mission-critical time-series applications. The company is dedicated to serving software developers and businesses worldwide, enabling them to build exceptional data-driven products that measure everything that matters: software applications, industrial equipment, financial markets, blockchain activity, consumer behavior, machine learning models, climate change, and more. Analyzing data across the time dimension (“time-series data”) enables developers to understand what is happening right now, how that is changing, and why that is changing. Timescale is a fully remote company with a global🌎 workforce and is backed by Tiger Global, Benchmark Capital, New Enterprise Associates, Redpoint Ventures, Icon Ventures, Two Sigma Ventures, and other leading investors. For more information, visit www.timescale.com or follow @TimescaleDB.
Working at Timescale🐯
Timescale is breaking boundaries and setting new standards in the innovative and rapidly growing time-series data industry. Built on the foundation of people-focused values and principles, Timescale makes sure integrity, mutual respect, and compassion are at the heart of everything we do. Empowered by our Co-Founders, Ajay Kulkarni (CEO) and Mike Freedman (CTO), we are challenging the norm by working with people who continuously inspire and teach us 🤝.
Enjoy debating the crunch-factor of different chicken nuggets 🍗, sweating it out during lunch 💦, talking about your kids, whether they be actual children 👶🏽, potted plants 🌱, or four-legged creatures 🐾? You’ll fit right in at Timescale!
What we’re offering
Benefits may differ from country to country.
- Flexible PTO and family leave
- Fridays off in August 😎
- Full remote work from almost anywhere
- Stock options
- Monthly WiFi stipend
- Professional development and educational benefits 📚
- Premium insurance options for you and your family (US employees)
- FSA/Dependent FSA plans (US employees)
- 401(k) retirement plan (US employees)
Req. 2373
Remote, US, Office of CTO/Operations Teams, Full Time
Our mission is to unlock the collaborative power of communities by making Web3 universally easy to use, access, and build on.
Working with ConsenSys puts you at the forefront of an evolving paradigm, transforming our society for the better. We fundamentally believe blockchain is the next generation of technology that can lay the foundation for a more just and equitable society.
Blockchain tech is just over 10 years old. Ethereum itself is still a toddler and we’re far from reaching our full potential. You’ll get to work on the tools, infrastructure, and apps that scale these platforms to billions of users.
You’ll be constantly exposed to new concepts, ideas, and frameworks from your peers, and as you work on different projects — challenging you to stay at the top of your game. You’ll join a network of entrepreneurs and technologists that reaches the edge of our ecosystem. ConsenSys alumni have moved on to become tech entrepreneurs, CEOs, and team leads at tech companies.
About ConsenSys’s Company Data Team (Office of CTO & Operations)
ConsenSys Data sits within ConsenSys Software Inc. to help address all our variants of data, break down silos, enable best practices, provide first-rate resources, and accelerate our mission of becoming a cutting-edge, data-driven organization. We provide some centralized data engineering functions as a shared service, while enabling our business units to make great data decisions with their own data functions.
ConsenSys Software Inc. is a wide organization, with each individual business unit facing unique data challenges. Infura needs to provide real-time analytics on top of a data pipeline handling billions of events per day. MetaMask Swaps needs to provide financial accounting for a purely on-chain data source. Truffle needs to track developer engagement across the open source ecosystem.
What you’ll do
We are looking for a data engineer to join our shared data engineering team, with a goal of helping to build, maintain and evolve our data warehouse to support the organization.
You will join a team to work with the business to ensure we have a first class data warehouse supporting our business units and our business decision making. Some of the key areas you will help ensure we are doing well are:
- data quality
- data governance
- master data management
- Understanding of the business and data strategy
- Contribute to the collection, storage, management, quality, and protection of data
- Implementing data privacy policies and complying with data protection regulations
- Effectively communicate the status, value, and importance of data collection to executive members and staff
- Knowledge of relevant applications, big data solutions, and tools
- Knowledge of real-time streaming data pipelines
- Governance: Advising on, monitoring, and governing enterprise data
- Operations: Enabling data usability, availability, and efficiency
- Innovation: Driving enterprise digital transformation innovation, cost reduction, and revenue generation
- Analytics: Supporting analytics and reporting on products, customers, operations, and markets
Who we’re looking for:
- Personal and/or Professional involvement in web3 (crypto, tokens, NFTs, dev tooling, dapps, DAOs, blockchain, courses, etc.)
- 5+ years of overall working experience in an enterprise engineering domain
- Tech stack you'll work within: Python, SQL, LookML/Looker, BigQuery (see the sketch after this list)
- Preferably focused on a company's operational data (pulling, analyzing, ETL, consolidating, integrating, building out pipelines, etc.)
- Experience collecting external on-chain market data from the web3 environment.
- Proficient in Python and SQL programming (building scripts from scratch)
- Well-versed with popular frameworks like Pandas, Flask, Airflow, Apache Beam, etc.
- Experience with at least one cloud platform, i.e. Google BigQuery (GCP), AWS, or Azure (GCP preferred)
- Hands-on experience with Cloud Functions, BigQuery, Dataflow, Cloud Run, Airflow, etc.
- Strong command of Linux and shell scripting
- Experience with Docker & Kubernetes
- Experience building CI/CD pipelines
- Knowledge of at least one SCM tool, such as Git or Bitbucket
- Strong knowledge of Terraform scripting
- Enthusiasm for shipping high-quality code and helping peers do the same
- Understanding of web development practices and terminology
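As a hedged illustration of the tech-stack bullet above, here is a minimal sketch of querying BigQuery from Python. The project, dataset, and table names are hypothetical, and it assumes application-default credentials are already configured:

```python
from google.cloud import bigquery  # pip install google-cloud-bigquery

# Hypothetical events table; credentials come from the environment.
client = bigquery.Client()
query = """
    SELECT DATE(event_timestamp) AS day, COUNT(*) AS events
    FROM `my-project.analytics.raw_events`
    GROUP BY day
    ORDER BY day DESC
    LIMIT 7
"""
# client.query() submits the job; .result() blocks until rows are ready.
for row in client.query(query).result():
    print(row.day, row.events)
```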
Bonus points:
- Hands-on experience with Kafka
- Hands-on experience with Apache Spark (PySpark preferred)
- You’re a MetaMask user!
Don't meet all the requirements? Don't sweat it. We’re passionate about building a diverse team of humans and as such, if you think you've got what it takes for our chaotic-but-fun, remote-friendly, start-up environment—apply anyway, detailing your relevant transferable skills in your cover letter. While we have a pretty good idea of what we need, we're ready for you to challenge our thinking on who needs to be in this role.
ConsenSys is an equal opportunity employer. We encourage people from all backgrounds to apply. We are committed to ensuring that our technology is made available and accessible to everyone. All employment decisions are made without regard to race, color, national origin, ancestry, sex, gender, gender identity or expression, sexual orientation, age, genetic information, religion, disability, medical condition, pregnancy, marital status, family status, veteran status, or any other characteristic protected by law. ConsenSys is aware of fraudulent recruitment practices and we encourage all applicants to review our best practices to protect yourself which can be found https://consensys.net/careers/best-practices-to-avoid-recruitment-fraud/.
#LI-Remote #LI-EC1
About Alchemy Worx:
Alchemy Worx is an audience management agency specializing in email marketing, SMS marketing, and paid social. The agency offers businesses advanced marketing services, utilizing time-tested methods, and procedures to secure a higher rate of customer retention.
In the reporting and analytics process, the team dissects and analyzes the email list, mailing style, and frequency to determine campaign quality, health, and effectiveness for email and SMS. After reviewing all KPIs, the analytics team builds predictive models for benchmarking purposes and creates a strategy to hit those benchmarks. In addition, they will look at Facebook Ads Manager to analyze how each campaign is doing and find room for optimization.
We are looking for a tech savvy person who is ambitious and an aggressive self-starter. The ideal candidate should have the ability to manage data analytics, run reports, and excel in a client-facing role.
What You’ll Do:
- Manage an efficient, comprehensive, and accurate reporting & analytics process carried out by on-shore and offshore teams
- Perform comprehensive analyses on large sets of email marketing and Facebook data to extract actionable insights to help drive customer and internal decisions
- Communicate data-driven insights and recommendations for client and internal use
- Develop thorough understanding of email user data flow and recommend best practices to streamline data flow, processes and related standards
- Develop and maintain internal and external controls to store and retrieve client data, reports, marketing material, and other data
- Work closely with cross-functional teams of data analysts, designers, account strategists, and others across the company who are passionate about the company’s success
- Develop self-serve reports, internal controls, and proofing checklists to monitor customer email data hygiene
Who You Are:
- You have a deep passion, curiosity, and understanding of advanced data analysis and a strong business sense
- You are a self-starter, problem solver, resourceful and are able to work with limited supervision
- You are flexible and able to manage multiple tasks and deadlines
- You have the ability to work within a globally dispersed team
- You are a strong verbal and written communicator and an excellent collaborator who values building strong relationships with colleagues, clients, and other stakeholders
- You have the ability to understand complex topics/processes/problems and explain them to your colleagues in simple terms
- You are customer, results, and solutions oriented
Requirements
- 2 years of experience in a role that required advanced analytics, extensive reporting, and extracting insights/recommendations from large datasets
- University Degree in science, computer science, statistics, economics, mathematics, or similar quantitative discipline
- Excellent knowledge of Excel (VLOOKUP, INDEX, MATCH, SUMIFS, and COUNTIFS functions a must)
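To make the Excel requirement above concrete, here is a minimal, hypothetical sketch of the same patterns (VLOOKUP/INDEX-MATCH lookups and SUMIFS/COUNTIFS conditional aggregation) expressed in pandas; the data is invented:

```python
import pandas as pd

# Hypothetical campaign data standing in for a typical reporting workbook.
sends = pd.DataFrame({
    "campaign_id": ["C1", "C2", "C3"],
    "client": ["Acme", "Acme", "Globex"],
    "opens": [1200, 800, 450],
})
clients = pd.DataFrame({"client": ["Acme", "Globex"], "tier": ["Gold", "Silver"]})

# VLOOKUP / INDEX-MATCH equivalent: enrich each row with the client's tier.
enriched = sends.merge(clients, on="client", how="left")

# SUMIFS / COUNTIFS equivalent: conditional aggregation per client.
summary = enriched.groupby("client").agg(total_opens=("opens", "sum"),
                                         campaigns=("campaign_id", "count"))
print(summary)
```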
Preferred Skills / Experience:
- Project Management Experience
- Management and communication with globally dispersed teams
- Coding and Database Skills
- Knowledge of Tableau or Data Studio
- Knowledge of Email Service Provider Platforms and email campaign building
- Knowledge of SQL, Python, or R
- Ability to build predictive models and cluster analysis models in Python OR R
Benefits
- Unlimited PTO policy
- Fully covered medical, dental and vision insurance
- 401k with company match
- Hybrid team, with both main office and remote team members
Sinch is a global leader in the growing market for Communication Platforms as a Service (CPaaS) and mobile customer engagement. We are specialists in allowing businesses to reach everyone on the planet, in seconds or less, through mobile messaging, email, voice, and video.
More than 150,000 businesses, including many of the world’s largest companies and mobile operators, use Sinch’s advanced technology platform to engage with their customers. Moreover, Sinch has been profitable and fast-growing since its foundation in 2008.
Sinch’s core values are Make it Happen, Dream Big, Keep it Simple and Win Together. These values describe how our global organization works and inspire every one of our more than 3,000 employees across 55 different countries.
Sinch is organized into five business units: Applications, Enterprise & Messaging, Voice, Developer & Email, and SMB. This role will be under Applications.
The Applications team is responsible for delivering innovative solutions through conversational platforms and next-generation communication channels like WhatsApp, Instagram, Telegram, Apple Business Chat, Line, Kakao Talk, and other telecommunications channels like SMS, MMS, and RCS.
The essence of the role.
As an analytics engineer, you will be working alongside the analytics team to be immersed in the business challenges and opportunities of our enterprise clients. You will support the delivery of insights through analytic warehouse support – sometimes visionary and bold, other times practical and pragmatic – that will help brands communicate more effectively using mobile messaging.
You will use tools like Snowflake, AWS, SQL, Python, and Tableau to support the delivery of testing strategies, analysis, research, listening, empathy, and storytelling to cultivate an informed POV that addresses the unique challenges of the brands we work with. You will be a strategic partner to our analytic team and a collaborative partner with our internal product engineering teams.
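As a hedged sketch of the day-to-day tooling described above, here is what pulling a weekly engagement summary from Snowflake with Python might look like; every account, object, and column name here is hypothetical, and credentials come from the environment:

```python
import os
import snowflake.connector  # pip install snowflake-connector-python

# Hypothetical warehouse/database/schema names.
conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="ANALYTICS_WH",
    database="MESSAGING",
    schema="PUBLIC",
)
cur = conn.cursor()
# Messages per channel over the trailing week (hypothetical table).
cur.execute("""
    SELECT channel, COUNT(*) AS messages
    FROM conversations
    WHERE sent_at >= DATEADD(day, -7, CURRENT_DATE)
    GROUP BY channel
    ORDER BY messages DESC
""")
for channel, messages in cur.fetchall():
    print(channel, messages)
conn.close()
```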
In this role, you will:
- Manage the portfolio of new capabilities for the warehouse, and partner with analytics on prioritization and requirements gathering.
- Maintain partnerships with upstream product engineering teams and client-side analytics teams, establish SLAs, monitor that data delivery is in alignment with SLAs, and keep a working knowledge of the source system APIs and the data passed from systems (ex. Fiddleback, Chatlayer).
- Model data to provide clean, accurate datasets that the different users within the analytic team can rely on.
- Develop the ETL and the physical and logical layers of the warehouse for consumption in other analytic tools (ex. Tableau, Python, etc.).
- Onboard new client data, platform connections, or new business rules that require adjustments to the data architecture/views within the warehouse.
- Monitor data feeds/refreshes within the warehouse daily, and troubleshoot issues as needed.
- Train analysts and maintain documentation of the table structures, data dictionary, and change logs for the data warehouse.
- Troubleshoot the platform and address data errors, failed updates, and other issues that may occur from upstream systems.
- Develop and monitor event files and other custom reports that are delivered back to clients for engagement data from our platforms as needed.
Requirements
- 5+ years of related experience working in a data warehousing, data engineering or data architect role.
- 5+ years of experience building data warehouses designed for analytics teams and managing product roadmaps; experience in a marketing analytics role is a plus.
- Strong skills in Snowflake and AWS as well as data languages like R and Python to handle data orchestration tasks.
- Demonstrated ability to integrate campaign platform tools with analytics tools to gain an end-to-end view of the campaign experience, supporting analysis, reporting and optimizations.
- Deep experience in developing ETL, data analysis and report generation through tools such as Excel, Tableau, PowerBI, R, Python, AWS, and Snowflake.
- Experience with conducting single and multivariate campaign testing.
- Experience working in marketing analytics or experience building solutions to support campaign reporting and insights a plus.
- Experience working independently, self-managing, and managing others on projects to deliver results supporting multiple clients and stakeholders.
- A degree in statistics, mathematics, computer science, software engineering, or IT is a plus.
Benefits
Sinch is a global company composed of people from different countries and cultures. Our benefits adjust regionally to support employees and help them to thrive in every stage of life. We offer valuable benefits and resources, including health and life insurance, a flexible work environment, retirement savings plans, and more.
Our Hiring Process.
At Sinch, we are committed to following a recruitment process that is fair, objective, consistent, and non-discriminatory. Our Talent Acquisition team, together with hiring managers and the rest of the interviewing team, persistently works towards identifying the candidates that best fit each open job, based on Sinch’s hiring needs and candidates’ career expectations.
We encourage applications from strong candidates with relevant professional backgrounds for this role. Not all applicants will meet all job requirements exactly! Even if you do not meet all job requirements, don't let that stop you from considering Sinch for the next step in your career. We are always open to candidates that could bring new ideas and perspectives to Sinch!
Data Entry Transcription
Location: United States
Work from Home/Remote Full-Time
$15.00 – $16.00/Hour
Job Details
Now hiring for remote data entry transcription! Take advantage of this great opportunity with a strong and growing company with potential room for growth!
Remote work opportunity!
Job Details for data entry transcription:
- Schedule: Remote training 8:00am-4:30pm. Flexible 40-hour remote schedule after training.
- Pay Rate: $16
Job Responsibilities & Description for data entry transcription:
- Transcribe calls including but not limited to: satisfaction survey responses, medical conditions, prescription drug names, and member information
- Transcribe recorded audio from phone-based interaction
- Escalate or transfer calls as needed
Job Requirements for data entry transcription:
- Able to type at least 50 wpm accurately
- Ability to work independently with minimal supervision
- Proficient with Microsoft Word, Excel, Outlook, Internet Explorer
- Bilingual is a plus
Transaction Coordinator
United States | Operations (COO)
Full time
Remote
Description
Would you like to be part of a growing national healthcare solutions company? Are you looking to positively affect thousands of lives each day via health benefits?
We are hiring for a Transaction Coordinator to join our team.
Who we are
Allied is a national healthcare solutions company that supports healthy workplace cultures.
What we do
We are problem-solvers, innovators, and collaborators. Our purpose is to work with employers to take care of their employees and their families every day and it all starts with the Allied family.
What’s in it for you?
Allied supports an inclusive culture focused on developing employees to succeed, innovate & impact the community.
Here’s how we do it
Training and Development: Allied offers tailored learning and development curriculums for all employees and a Learning Management Database with thousands of courses for professional and personal development.
Career Mobility: Growth opportunities are endless at Allied. In 2021 alone, one in five employees had a job change. 75% of these job changes were promotions!
Employee Engagement: We pride ourselves on employee engagement! With our recognition program, employees recognize their colleagues monthly with cash rewards or donations to charities. Allied has a dedicated committee planning monthly engagement activities to create endless opportunities to get to know your peers and destress in this new remote world.
Employee Feedback: We regularly survey our employees throughout the year to seek continuous feedback, ideas and suggestions on new initiatives.
Community Outreach: We have dedicated committees focused on fundraising efforts supporting our employees and their families, furthering education goals and providing funds for charitable organizations outside of Allied.
What will you be doing?
The Eligibility Services Department at Allied is responsible for the day-to-day operations of the following services: Eligibility, Prescription Benefit Manager (PBM), Flexible Spending Account (FSA), and COBRA. The Transaction Coordinator, Eligibility Services is responsible for accurately processing eligibility data and assisting with issue resolution.
ESSENTIAL FUNCTIONS:
- Input eligibility data and update member records in various technology platforms
- Terminate member records in eligibility programs in various technology platforms
- Process annual enrollment changes for assigned groups
- Assign member UIDs
- Review and audit Eligibility, PBM, FSA, and COBRA transaction and maintenance reports daily
- Review eligibility audit reports and determine if member updates are required
- Identify transaction-related processing errors
- Process ID card requests for assigned groups
- Process FSA debit card requests for assigned groups
- Process void and reissue payment requests within our FSA processing system
- Assist with various Eligibility Services projects
- Perform additional tasks and duties as assigned
SKILLS & ABILITIES:
Intermediate-level work experience with Microsoft Office, Word, Excel, and PowerPoint software applications.
Education:
Some college preferred
Experience:
2-4 years related experience
Previous experience with data entry preferred
Certificates & Licenses:
N/A
Physical Demands:
None
WORK ENVIRONMENT
Fully Remote
Work Schedule:
Flexible, during normal business hours, Monday-Friday.
Hiring is contingent upon successful completion of our background and drug screening process. Allied is a drug-free and tobacco-free workplace.
Diversity creates a healthier atmosphere: Allied is an Equal Employment Opportunity / Affirmative Action employer and all qualified applicants will receive consideration for employment without regard to race, color, religion, sex, age, national origin, protected veteran status, disability status, sexual orientation, gender identity or expression, marital status, genetic information, or any other characteristic protected by law.
Verifications Specialist ($15-$17 per hour)
Fully Remote | Mortgage
Full-time
Description
The Verifications Specialist is responsible for collecting and validating information presented by clients and customers by communicating with various institutions to verify the correct information.
Responsibilities
- Re-verify employment and assets by communicating with various employers/financial institutions
- Enter client information into computer systems and databases
- Order management of additional services such as SSA89, Tax Transcripts, Occupancy Verifications, Field Review Appraisals
- Collect, research, and analyze data
- Review documents to determine any necessary re-verifications
Requirements
Qualifications:
- High school diploma, GED, or equivalent, or any combination of education and experience; mortgage lending experience preferred
- Ability to read and interpret documents such as company policies; ability to write routine reports and correspondence; able to communicate professionally and clearly through both written and oral correspondence
- Experience with Microsoft Office, such as Microsoft Word, Microsoft Excel, and Microsoft Outlook
Employment is contingent upon completing and passing a background check and drug test. MetaSource is an equal opportunity employer.
ecosio is a fast-growing, innovative service company and a leading provider of B2B integration, specialising in electronic data interchange (EDI), supplier relationship management (SRM) and e-invoicing.
Our brand slogan is Connections That Work, which refers not only to the reliability of our EDI connections on a technical level but also to our long-lasting relationships with partners, customers and colleagues on a personal level. We are technology lovers, set the highest standards for our solutions, and put innovative ideas first.
Job Description
Are you passionate about finding new ways to extract and present data that will influence business decisions?
You'll connect with the role if you enjoy...
- driving data warehouse architecture design, modeling and ETL development
- collecting and combining data from various sources (Engineering, Finance, Sales and Marketing) and presenting it in a business-relevant way
- making the data available and visualizing insights for everyone in ecosio and external partners
- providing technical mentorship to non-Data-Engineer-ecosians to create their own reports
- building advanced analytical models (behavior segmentation, churn prediction, purchase propensity, recommendation engine, etc.) for engineering, finance, sales and marketing teams
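To illustrate the final bullet above, here is a minimal churn-prediction sketch with scikit-learn; the features, labels, and data are entirely synthetic, and a production model would of course start from real customer attributes:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Synthetic stand-ins for per-customer features (e.g. message volume,
# failed transmissions, tenure); label 1 = churned.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))
y = (X[:, 1] + rng.normal(scale=0.5, size=500) > 0.8).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression().fit(X_train, y_train)
print("holdout accuracy:", model.score(X_test, y_test))
```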
To connect with ecosio it is important to have…
- several years of experience with designing data lakes/warehouses based on cloud technologies
- a Master's degree in Data Science, Computer Science, Statistics, or Mathematics
- experience in performing data analysis, data ingestion and data integration
- solid communication skills and teamwork attitude to contribute to multi-functional teams
By connecting with us you will experience...
- flexible working hours with a 100 % remote working opportunity
- open corporate culture and flat hierarchies
- employee events and happiness team for little smiles along the way
- ongoing training and development
- passionate team that will win your heart
Sounds like a connection that works? Then apply by clicking "I'm interested" and we will get in touch soon!
The minimum salary according to the Austrian collective agreement is €37,150/year. Of course, we are happy to offer much higher compensation based on qualifications and experience!
Sound is hiring a Data Infrastructure Senior Engineer to help shape the future of a new music economy that values artists and their music while connecting fans more closely to the music they love.
Sound is a suite of web3-native music and economic tools powering the next generation of artists and their communities. We’re passionate about helping artists capture more value from their art, and connecting fans more closely to the music they love. Since launch, we’ve onboarded over 200 artists (including Snoop Dogg, Pussy Riot, Salem Ilese, RAC, Soulection, and more) and generated over $3.5 million in proceeds that have gone directly to those artists.
As a Data Infrastructure Senior Engineer, you’ll be responsible for building and maintaining the end-to-end data infrastructure and architecture that powers our product and data initiatives. You’ll play an instrumental part in architecting and scaling our infrastructure to support growth, defining the way product performance is measured, all while raising the bar for how high-quality and high-velocity decisions are made at Sound.
What you'll be doing:
- Build, maintain, and consistently evolve our data model and data schema to meet business and engineering requirements.
- Scale our data infrastructure with a DevOps mindset to support growth while maintaining performance and reliability.
- Support warehousing and analytics customers that rely on our data pipeline for analysis, modeling, and reporting.
- Manage data monitoring and alerting tools to provide more visibility into data quality, scalability, and performance metrics.
- Design and develop disaster recovery plans to ensure data integrity.
- Work across teams to identify and solve pain points in our data architecture.
Who we're looking for:
- 5+ years of experience building and operating critical data infrastructure.
- Proven track record of architecting large-scale data processing and proficiency with building and operating robust data pipelines using modern technologies such as AWS Kinesis, AWS Glue, Athena, PostgreSQL, etc. (see the sketch after this list).
- Experience with the AWS data ecosystem, including ECS, SNS, EventBridge.
- Experience championing quality through thoughtful code reviews, thorough testing, rollout execution, monitoring, and proactive ownership of code bases.
- Ability to communicate and discuss complex topics with technical and non-technical audiences.
- Ability to tackle ambiguous and undefined problems.
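The sketch referenced in the list above: a minimal, hypothetical example of writing one event to an AWS Kinesis stream with boto3. The stream name and payload are invented, and AWS credentials are assumed to be in the environment:

```python
import json
import boto3  # pip install boto3

kinesis = boto3.client("kinesis", region_name="us-east-1")

# Hypothetical product event flowing into the pipeline.
event = {"type": "track_minted", "track_id": "abc123", "quantity": 1}
kinesis.put_record(
    StreamName="listening-events",     # hypothetical stream name
    Data=json.dumps(event).encode(),   # Kinesis records are raw bytes
    PartitionKey=event["track_id"],    # controls shard assignment
)
```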
Nice-to-haves:
- Experience working with microservices.
- Experience scaling existing codebases.
- Experience with modern application frameworks (React, Relay, Next.js).
- Experience integrating with various web3 technologies, including Ethereum, Arweave, IPFS architectures.
- History of open source contributions.
Benefits at Sound:
- We offer top-of-the-line benefits, including health, mental health, dental, and vision insurance.
- Remote-first teamwork with team and community members around the world
- Work-from-home/remote office stipend
- Team offsites for periodic collaborative strategy sessions in person
- Passionate, supportive team dedicated to learning and growing together in web3
Sound is an equal opportunity employer. We do not discriminate based on gender, ethnicity, sexual orientation, religion, age, civil or family status, disability or race.
ClientSolv Technologies is an IT solution firm with over a decade of experience serving Fortune 1000 companies, the public sector, and small- to medium-sized companies. ClientSolv Technologies is a woman-owned and operated company that is certified as a WMBE and an 8(a) firm by the federal government's Small Business Administration.
Job Description
We are seeking a Data Analyst for a 12-month contract (with the option to extend further) in Denver, CO. This role can be performed remotely by residents of Colorado; the selected candidate must reside in Colorado due to program regulations.
This role will be responsible for the transformation of raw data into actionable information that departments will use to improve case management activities. This position uses data management software/applications such as MS Excel, TOAD and SharePoint to complete the required work.
Primary Job Responsibilities
- Pull down Protected Health Information (PHI) data from Department interface files.
- Cleanse, organize, and analyze the PHI data based on member affiliation to vendors contracted by the Department.
- Ensure the files are complete, current, and stored securely and appropriately.
- Provide training and technical assistance for users accessing the files.
- Present complex technical information on file access processes, data file design, and data exchange information, in a way that can be understood by audiences of varying technical mastery.
- Maintain user friendly files and provide actionable information.
- Implement different types of data provision, presentation and visualization, using applications like MS Excel.
Qualifications
- Experience with Data analysis
- Working knowledge of SQL
- Expertise in MS Excel and MS SharePoint/Teams.
This contract role will start off as a 12-month contract, with a strong chance of extensions, and is open to remote candidates who reside in Colorado; the selected candidate must reside in Colorado due to program regulations.
At Jimdo, we're big on small. Our mission is to unleash the power of the self-employed and small businesses—and help them thrive. Small businesses are the backbone of the global economy, but they receive little support or recognition. We see them and are here to support them. Join us to help design intuitive tools that enable small businesses to solve complex problems.
We run at a steady pace to achieve what we aim for. We learn best by digging deep into data, staying curious, taking calculated risks, and sometimes even falling down along the way. It's the lessons we learn in the process that make us better problem-solvers for small business owners.
If you're motivated by our mission and excited to roll up your sleeves, experiment, learn from mistakes, and make a difference to small businesses around the world, we would love to get to know you.
The Team
The Data Platform team is developing, operating, and improving a highly scalable, robust, and resilient data infrastructure, which is the backbone of all data services, the central data warehouse, and our reporting & analytics infrastructure. As business needs grow and become more diverse, the team plans to increase our systems' scalability and introduce new services for a variety of use cases, ranging from core infrastructure and Data/DevOps tasks to advanced monitoring and anomaly detection. The team cooperates with the Analytics teams in the Data Department to maximise the business impact and works closely with the Jimdo infrastructure teams.
Our expectations
You have 3 years of experience in one or more of these topics:
- Operating Linux or Docker
- AWS
- Software development (Java or Python)
- Infrastructure as code (Terraform, CloudFormation, etc.)
- CI/CD pipelines
- Data-related topics: Redshift, Snowflake, Airflow, dbt, etc.
- Design, build and operate a highly scalable data platform, further advancing our approach to designing robust, self-healing, resilient systems.
- Implement advanced monitoring and alerting with respect to the data infrastructure as well as the data, the data flows, and pipelines; this also includes anomaly detection.
- Ensure high test coverage and improve our QA and testing concepts with respect to the data pipelines and workflows.
- Educate and consult data & analytics engineers on designing, building, and operating maintainable, scalable, and reliable data services and workflows.
- Be responsible for the overall system's health of the data infrastructure.
- AWS
- Kubernetes / Docker
- Github-Actions / Terraform / Terragrunt / Atlantis
- Kafka
- Java / Python / Kotlin
- Airflow / DBT / Redshift / Tableau (see the Airflow sketch after this list)
- Jimdo’s success is rooted in no small part in consistently using state-of-the-art cloud services. We are looking for engineers who have a solid grasp of cloud technologies and a strong interest in distributed systems.
- Our data infrastructure and the services running on top of it ultimately contribute to the success of our several million customers, and we believe that in the future data will play an even more significant role both for our users and for Jimdo. You fit right in if you share the same view about creating value from data and have experience building and operating great tooling for this purpose.
- We leverage different technologies and languages depending on the problem we try to solve, so we value people who are able to pick up new languages and tools when necessary and are able to find the right tool for the job at hand. Currently, we use e.g. Python and Java, but also some Ruby and Kotlin.
- You have excellent problem-solving skills. You use a systematic and thorough approach. You think from the first principles. You have a bias for action and know how to diagnose and resolve problems within complex systems.
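As a hedged illustration of how parts of the stack listed above can fit together, here is a minimal Airflow DAG that runs a daily data-quality check and alerts by failing the task; the DAG id, threshold, and check logic are hypothetical:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def check_row_count(**context):
    # Hypothetical check: fail the task (triggering Airflow alerting)
    # if the daily load wrote suspiciously few rows.
    row_count = 12_345  # in practice: query Redshift/Snowflake here
    if row_count < 1_000:
        raise ValueError(f"Expected >= 1000 rows, got {row_count}")

with DAG(
    dag_id="daily_data_quality_check",  # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    PythonOperator(task_id="check_row_count", python_callable=check_row_count)
```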
Jimdo is proud to be an equal-opportunity employer. This means that we don't discriminate based on race or ethnic origin, color, the language(s) you speak, where you (or your parents) are from, or whether or not you consider yourself to have a disability. Neither will your age, gender, gender identity, sexual orientation, religion, beliefs, or political opinions play a part in your application with us. We're a diverse team in so many ways, and we love it that way.
Vasiliki is looking forward to receiving your application.
By sending your application, you declare that you have read and understood the Jimdo Applicant Privacy Policy.
Trilateral is recruiting for a Data Protection Advisor, with expertise and experience in data protection compliance, to join our Data Protection and Cyber-risk service. The candidate should have sound knowledge of European data protection legislation, including the General Data Protection Regulation (GDPR), the EU Data Protection Regulation (EU DPR), national data protection legislation, and e-Privacy legislation.
Our company and what we do
At Trilateral Research we feel passionately about making the world a better place by providing Ethical AI solutions to tackle complex social problems. With award-winning services in research, data protection and cybersecurity, ethics innovation and sociotech insights, our team takes an end-to-end approach that fully integrates the technical, legal and social science dimensions of a problem. Our ethical AI has a tangible impact in the fight against the biggest challenges we are currently facing such as modern slavery and human trafficking, climate action, child exploitation, human security in conflict and crisis settings, and many more.
Our Data Protection and Cyber-risk service supports these internal activities and provides consultancy services to public sector and private sector organisations independently. We serve some of the most well-known organisations in Ireland, the UK, Europe and beyond, assisting them to improve their data protection compliance maturity.
Key responsibilities
Executing data protection support work and data protection compliance efforts required by clients, including:
- Providing outsourced DPO and DPO Assist services to clients and managing client relationships, including tasks such as developing consent and privacy notices, managing SARs and guiding clients through personal data breaches
- Data protection compliance activities including data protection impact assessments, compliance audits, gap analyses, data mapping services to aid in addressing risk management issues and complying with data protection requirements
- Writing briefs for clients on relevant topic areas, including newsletter pieces and other outreach material
- Attending and presenting at client meetings, as required
- Business development tasks, including developing proposals and bids for new projects
Skills and experience
- An LLM or other relevant postgraduate qualification
- Thorough working knowledge of Irish or UK and EU data protection legislation (including GDPR); knowledge of national data protection legislation for the US or other EU countries is a plus
- Knowledge and proven experience in the areas of some or all of the following:
- Data protection,
- Data privacy,
- Data governance,
- Auditing, compliance
- Information security and/or
- Risk management
- Ability to analyse legislative requirements and translate these into organisational practices and solutions
- Awareness of information systems and databases to determine how processes need to be developed
- Project management skills, a proven track record of leading projects to successful completion
- Proven client management skills and experience
- Experience working with public sector organisations is preferred but not essential
- CIPP/E or other data protection or information security compliance certifications preferred
- 3+ years of post-degree experience
About us
Our culture is based on delivering high-quality outputs, through our commitment and passion for what we do. We work in an open and collaborative environment where the team culture provides support amongst peers and colleagues. We believe in the strength of a diverse, gender-balanced environment with a positive work-life balance, and value the passion and talents of our team.
Find out about our people and culture, and see how our mission drives the research projects we take on, the key data protection services we provide, and the technology products and supplementary services we develop, by visiting our website Trilateral Research.
What else do I need to know?
Our compensation package includes:
- Competitive salary
- Flexible working hours
- Remote working/working from home options
- Competitive pension scheme (applies to permanent contract only)
- Continuous career development
Location: This position is open to candidates based in the UK and Ireland
Contract type: Permanent, full-time Employment Contract
Salary: Commensurate with experience
Hours: Full time
How to apply
Please submit both your CV and a cover letter, linking your experience to our requirements, in order to have your application considered. References will be required prior to appointment and candidates must be eligible to work in the UK/IE.
We are an Equal Opportunities employer and positively encourage applications from suitably qualified and eligible candidates, regardless of their age, sex, race, disability, sexual orientation, gender reassignment, religion or belief, marital/civil partnership status, or pregnancy and maternity. We are a Disability Confident committed and Living Wage employer.
We are looking for an Intermediate Data Engineer
Our client is a capital management company that develops quantitative arbitrage strategies and manages several of the world’s largest and highest-performing crypto hedge funds. They operate at the intersection of technology and finance, with a team that comes from the best of both. They create high-performance, mission-critical software that trades billions of dollars. Our client takes a risk-neutral approach to the crypto industry that allows them to thrive even in a down market. Join a world-class team and capitalize on your potential. Work in a results-oriented environment, where your personal contributions will have a huge impact on the company's performance in real time. Build systems that generate huge returns and build your own personal wealth alongside us. Realize your creative and earning potential by joining a small team that is creating the highest-performing fund in the crypto world.
Job Overview
Our client is looking for a Data Engineer to join their team. As a Data Engineer, you will be responsible for expanding and managing their extensive data pipeline. Their organization is powered by dozens of integrations importing huge amounts of real-time data and processing it at high speed. This powers their trading, accounting, and automated strategies.
Responsibilities
- Expand and manage an extensive data pipeline
- Build systems to ingest and normalize huge real-time data sets (see the sketch after this list)
- Develop algorithms to transform data into useful, actionable information
- Build, test and maintain data pipeline architectures
- Create new data validation methods and data analysis tools
- Ensure compliance with data governance and security policies
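The sketch referenced in the responsibilities above: a minimal, hypothetical example of normalizing trade payloads from two exchanges onto one shared schema. The payload shapes and exchange names are invented stand-ins for real feeds, which differ in field names and timestamp units:

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class Trade:
    """Normalized trade record shared by all downstream consumers."""
    exchange: str
    symbol: str
    price: float
    size: float
    ts: datetime

def normalize(exchange: str, raw: dict) -> Trade:
    """Map exchange-specific payloads onto the shared schema."""
    if exchange == "exchange_a":
        return Trade(exchange, raw["pair"].replace("-", "/"), float(raw["px"]),
                     float(raw["qty"]),
                     datetime.fromtimestamp(raw["ts_ms"] / 1000, timezone.utc))
    if exchange == "exchange_b":
        return Trade(exchange, raw["symbol"], float(raw["price"]),
                     float(raw["amount"]),
                     datetime.fromtimestamp(raw["time"], timezone.utc))
    raise ValueError(f"unknown exchange: {exchange}")

print(normalize("exchange_a",
                {"pair": "BTC-USD", "px": "64000.5", "qty": "0.01",
                 "ts_ms": 1700000000000}))
```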
Skill Requirements
- Bachelor's or Master's degree in computer science or mathematics
- Experience working with big data in a real-time setting
- Experience building and maintaining data infrastructure
Why work with our client
- Competitive salary
- Profit sharing, equity and bonus structures available
- Access to invest in our exclusive fund
- Work from anywhere in the world with office locations in Toronto and the Bahamas
- Unlimited paid time off and ability to work within your desired lifestyle
Daxko powers health & wellness throughout the world. Every day our team members focus their passion and expertise in helping health & wellness facilities operate efficiently and engage their members.
Whether a neighborhood yoga studio, a national franchise with locations in every city, a YMCA or JCC--and every type of organization in between--we build solutions that make every aspect of running and being a member of a health and wellness organization easier and delightful.
Job Description
Daxko is seeking a senior software engineer with experience in C#, SQL, and data to be a hands-on engineer on our Data & Analytics team. This engineer will work with Daxko’s business intelligence and data warehouse products. We are an Agile/Scrum shop with a strong sense of teamwork and collaboration, where teams are encouraged to operate with the autonomy and flexibility of startups, while enjoying the benefits of being part of an established company.
Your Responsibilities:
- Delivering high-quality, unit-tested code by practicing pragmatic software engineering principles
- Working with the team to size and groom the product backlog
- Collaborating with other teams across the organization (e.g., Product Management, Engineering, Data Science) to enable the better use and understanding of data
- Leveraging Agile best practices (vertical vs. horizontal development, breaking things down into smaller pieces, driving to done)
- Working with the team to own products and features end-to-end
- Providing actionable feedback in code reviews
- Leading system architecture and design reviews
- Participating in user story creation in collaboration with the team
- Guiding team members to develop prototypes as necessary and validating ideas with a data-driven approach
- Mentoring team members in all aspects of the software development process
- Bachelor’s degree in an Engineering-based discipline or equivalent experience
- 2-5 years of work experience programming with C# or Java
- Expert SQL skills and T-SQL experience
- Experience with BI reporting tools
- Working knowledge of current .NET architectural/development best practices and design patterns
- Knowledge of available tools, technologies, methodologies, processes, and best practices to develop software
- Strong analytical/problem solving skills and a craving for details
- Be open to new technologies, industry trends and ability to adopt latest design methodologies
- Solid interpersonal skills and comfort in a collaborative development environment
- Ability to build positive relationships with internal and external stakeholders
Bonus Points for:
- Kimball methodology or dimensional models (star schema); see the sketch after this list
- Columnar databases (e.g. Amazon Redshift)
- Realtime Data Streaming (e.g. Kafka, Kinesis)
- Serverless Interactive Query Solutions (e.g. Athena, Redshift Spectrum)
- Agile Methodology
- NoSQL databases
- REST Web Services
- Unit Testing Frameworks (e.g. Machine.Specifications)
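The sketch referenced in the first bonus point: a minimal, self-contained star-schema example using SQLite, with one fact table joined to two dimensions and sliced by a dimension attribute. All table names, columns, and numbers are hypothetical:

```python
import sqlite3

# One fact table keyed to two dimension tables (Kimball-style star schema).
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_facility (facility_key INTEGER PRIMARY KEY, name TEXT, region TEXT);
CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, full_date TEXT, month TEXT);
CREATE TABLE fact_checkin (
    facility_key INTEGER REFERENCES dim_facility(facility_key),
    date_key INTEGER REFERENCES dim_date(date_key),
    checkins INTEGER
);
INSERT INTO dim_facility VALUES (1, 'Downtown YMCA', 'South'), (2, 'Yoga Studio', 'West');
INSERT INTO dim_date VALUES (20240101, '2024-01-01', '2024-01'), (20240102, '2024-01-02', '2024-01');
INSERT INTO fact_checkin VALUES (1, 20240101, 120), (1, 20240102, 95), (2, 20240101, 30);
""")

# Typical dimensional query: aggregate the fact table by dimension attributes.
for row in conn.execute("""
    SELECT f.region, d.month, SUM(c.checkins)
    FROM fact_checkin c
    JOIN dim_facility f USING (facility_key)
    JOIN dim_date d USING (date_key)
    GROUP BY f.region, d.month
"""):
    print(row)
```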
Daxko is dedicated to pursuing and hiring a diverse workforce. We are committed to diversity in the broadest sense, including thought and perspective, age, ability, nationality, ethnicity, orientation, and gender. The skills, perspectives, ideas, and experiences of all of our team members contribute to the vitality and success of our purpose and values.
We truly care for our team members, and this is reflected through our offices, benefits, and great perks. Some of our favorites include:
🏝 Flexible paid time off
⚕️ Affordable health, dental, and vision insurance options
💪 Monthly fitness reimbursement
🤑 401(k) matching
🍼 New-Parent Paid Leave
🏖 1-month paid sabbatical every 5 years
👖 Casual work environments
🏡 Remote work

All your information will be kept confidential according to EEO guidelines.
About ConsenSys Data
ConsenSys Data sits within ConsenSys Software Inc. to help address all our variants of data, break down silos, enable best practices, provide first-rate resources, and accelerate our mission of becoming a cutting-edge, data-driven organization. We provide some centralized data engineering functions as a shared service, while enabling our business units to make great data decisions with their own data functions.
ConsenSys Software Inc. is a wide organization, with each individual business unit facing unique data challenges. Infura needs to provide real-time analytics on top of a data pipeline handling billions of events per day. MetaMask Swaps needs to provide financial accounting for a purely on-chain data source. Truffle needs to track developer engagement across the open source ecosystem.
We are looking for a data engineer to join our shared data engineering team, with a goal of helping to build, maintain and evolve our data warehouse to support the organization.
You will join a team to work with the business to ensure we have a first class data warehouse supporting our business units and our business decision making. Some of the key areas you will help ensure we are doing well are:
- data quality
- data governance
- master data management
- Understanding of the business and data strategy
- Contribute to the collection, storage, management, quality, and protection of data
- Implementing data privacy policies and complying with data protection regulations
- Effectively communicate the status, value, and importance of data collection to executive members and staff
- Knowledge of relevant applications, big data solutions, and tools
- Knowledge of real-time streaming data pipelines
- Governance: Advising on, monitoring, and governing enterprise data
- Operations: Enabling data usability, availability, and efficiency
- Innovation: Driving enterprise digital transformation innovation, cost reduction, and revenue generation
- Analytics: Supporting analytics and reporting on products, customers, operations, and markets
Who we’re looking for:
- 5+ years of work experience in an enterprise engineering domain
- Excellent problem-solving skills and sharp attention to detail
- Solid written and verbal communication skills
- Proficient in Python programming and well-versed with popular frameworks like Pandas, Flask, Airflow, Apache Beam, etc.
- Experience with at least one cloud platform, i.e. GCP, AWS, or Azure (GCP preferred)
- Hands-on experience with Cloud Functions, BigQuery, Dataflow, Cloud Run, Airflow, etc.
- Strong command of Linux and shell scripting
- Experience with Docker & Kubernetes
- Experience building CI/CD pipelines
- Knowledge of at least one SCM tool, such as Git or Bitbucket
- Strong knowledge of Terraform scripting
- Strong skills in writing SQL
- Enthusiasm for shipping high-quality code and helping peers do the same
- Proactive and self-driven, able to succeed working in a remote environment
- Understanding of web development practices and terminology
- Relevant knowledge of cloud security and data security
- A belief in our mission and values
Bonus points:
- Blockchain expertise
- Hands-on experience with Kafka
- Hands-on experience with Apache Spark (PySpark preferred)
- You’re a MetaMask user!
Makeship exists to empower influencers, creators, and brands of all sizes to develop and launch limited edition products that matter to their fans. Leveraging our design, manufacturing, and marketing expertise, we work with our partners to bring their product to life through our community-powered crowd-funding platform. Each product is given a window of 21 days to be funded by the community before we produce and ship to fans worldwide. We put our brand behind every product and guarantee quality and ethical sourcing. We're profitable, have grown the team from 3 to 60+ people in under 3 years, and we’re growing at an average annual growth rate of 400%+.
About the Role
As an early member of the Data Team, you’ll play a huge role in developing and architecting a scalable data infrastructure for Makeship. Data is one of our most important assets, and you will enable better decision-making for all teams across the company. Every day, you’ll collaborate cross-functionally, learn more about the content creation space, and watch your work make a measurable impact.
We want this to be the best work experience of your life, so we’ll pay you well, offer great benefits, and invest deeply in your personal growth.
Why this Role?
- Have a massive impact on the company. As our first database engineer hire, there will be plenty of opportunity for you to lead and implement technical projects. Your contributions will be felt right away, and will affect how we interpret data for years to come!
- Develop and architect significant changes. You will have the freedom to be creative and architect/develop a scalable database. This will challenge you to think about your design and bring your own ideas to life.
- Join us at an epic time. We’re a profitable and growing startup with millions in revenue. We’ve bootstrapped the company from 2 to 40 employees in 3 years. Join us and experience exponential personal and career growth!
- Build infrastructure for extraction, transformation, and loading of data from 10+ sources
- Design, develop and maintain data integration pipelines for growing data sets
- Collaborate with all teams to take requirements from prototype to production
- Build data validation testing frameworks to ensure high data quality and integrity (see the sketch after this list)
- Write and maintain documentation on data pipelines and schemas
- Create dashboards to support data-driven business decisions
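As a hedged illustration of the data validation testing mentioned above, here is a minimal sketch in Python using pandas; the table shape, column names, and checks are assumptions invented for this example, not Makeship's actual schema.

```python
import pandas as pd

# Hypothetical example: the orders schema and checks below are invented.
def validate_orders(df: pd.DataFrame) -> list[str]:
    """Return a list of data-quality errors found in an orders extract."""
    errors = []
    if df["order_id"].duplicated().any():
        errors.append("duplicate order_id values")
    if df["amount"].lt(0).any():
        errors.append("negative order amounts")
    if df["created_at"].isna().any():
        errors.append("missing created_at timestamps")
    return errors

orders = pd.DataFrame({
    "order_id": [1, 2, 2],
    "amount": [25.0, -5.0, 12.5],
    "created_at": pd.to_datetime(["2023-01-01", None, "2023-01-03"]),
})
print(validate_orders(orders))  # all three checks fire on this sample
```

In a production pipeline, checks like these would typically run as a gate before data is loaded into the warehouse.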
Requirements
- At least 3 years of relevant work experience in data engineering and/or analysis
- Interest in the content creation space: art, animation, gaming, and entertainment!
- Proven experience in building data systems and ETL
- A solid understanding of relational databases, data storage, and data manipulation
- A strong ability to wrangle data and find answers to complex real-world questions
- Experience with cloud native infrastructure (AWS, Docker, Kubernetes, etc.)
- Experience with BI tools like Google Data Studio, Power BI, etc.
- Excellent communication with both technical and non-technical stakeholders
- An ability to gather information and requirements yourself, in a fast-paced environment
- Experience with our technology stack (PostgreSQL, BigQuery, Kafka, AWS, GCS)
- Advanced skills in data scripting and database development technologies (SQL, Python, R)
- API expertise for Google Analytics, Facebook, Twitter, etc.
- Experience building data pipelines and warehouses.
- Worked with data from Shopify, Hubspot, or Airtable APIs.
- Prior experience building and scaling ML solutions
- Worked in a startup or similar environment.
Benefits
- Work remotely anywhere in Canada and/or access any of our hubs
- Health and dental benefits
- 3 weeks of paid vacation
- 1 week of paid time off during the holidays
- 2 mental health and wellness days
- Paid time off on your birthday
- Monthly phone allowance
- $400 home office setup allowance
- Pregnancy and parental leave top-up program
- Learning and development opportunities
- Employee referral program
Makeship is committed to creating a diverse environment and is proud to be an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, colour, religion, gender, gender identity or expression, sexual orientation, national origin, genetics, disability, age, or veteran status.
Fraym is mapping humanity to help governments, organizations, and companies solve some of the most pressing challenges of our time: inequity and insecurity, climate vulnerability, public health, access to critical services, and more. To us, mapping humanity is as much about cutting-edge tech as it is about the people that live in the places we illuminate. We believe community-level data is a critical requirement for bringing about positive change.
Summary of Position
Fraym is seeking a data architect to support data management and enable data science workflows at scale. Your contributions will support decisions in emerging markets across commercial, international development, and intelligence sectors.
You will be part of a team responsible for implementing Fraym’s data management strategy and will play a critical role in scaling our existing solutions. In addition to implementing core data infrastructure components and managing cloud resources, you will be responsible for leading Fraym’s future data architecture.
You should have a strong background in data modeling and experience building and maintaining cloud-based data systems and applications. We are looking for someone who can design and manage creative solutions for managing diverse and disparate data. Preference will be given to applicants with experience in cloud security and infrastructure as code.
Your responsibilities will include, but are not limited to, the following:
- Designing and building upon existing AWS-based data management systems that integrate household surveys, satellite imagery, and other spatial data
- Managing data environments for developing and testing data pipelines and machine learning experiments
- Leading project teams of data scientists to outline data requirements for tools that simplify internal data discovery and analysis
- Providing support for cloud processes and standards across the company
- Mentoring team members in technical areas of expertise, including data architecture and cloud infrastructure
- Collaborating with business development and client-facing teams to expand external data delivery options
You will have the following qualifications and skills:
- Interest in or passion for Fraym’s mission
- Commitment to diversity, equity, and inclusion
- At least 2 years of data architecture and/or data engineering work; preference will be given to applicants with practical experience building and maintaining cloud-based data management systems
- Essential skills: Experience with cloud computing environments (particularly AWS), database management, containerization, and Python
- Experience or ability to collaborate within a distributed team and communicate effectively with colleagues of different technical backgrounds
- Ability to develop skills quickly, learn new tools, and solve problems independently.
- Bonus Points for experience working with infrastructure as code (especially Terraform), cloud security best practices, permissions management, and orchestration tools
We will begin reviewing applications for this position in mid-September.
Forefront Telecare Inc. (now part of Access Telecare) is a rapidly growing Behavioral Health Telecare company that provides critical care services to patients in Hospital Psych Units, Hospital Emergency Departments, Nursing Facilities, and their homes.
At this time, we are looking for a Data & Analytics Specialist to help build our next generation enterprise performance management and reporting. This position will report to our Chief Operating Officer.
Responsibilities
Interpret data, analyze results using statistical techniques and provide ongoing reports
Acquire data from primary or secondary data sources and maintain management and performance reporting
Identify, analyze, and interpret trends or patterns in complex data sets
Filter and “clean” data by identifying issues, their sources, and fixes
Work with management to prioritize business and information needs
Locate and define new process improvement opportunities in management and performance reporting
Skills
Three-plus years of work experience as a data analyst or business data analyst
Technical expertise regarding data models, database design development, data mining and segmentation techniques
Strong knowledge of and experience with business intelligence and reporting tools, especially Microsoft Power BI
Knowledge of statistics and experience using statistical packages for analyzing datasets (Excel, etc.)
Knowledge of and interest in the healthcare ecosystem and the role of behavioral health
Experience with healthcare ecosystem systems and tools, including CRMs and EMRs; experience using CRM systems such as Salesforce and EMRs as reporting sources is a plus
Strong analytical skills with the ability to collect, organize, analyze, and disseminate significant amounts of information with attention to detail and accuracy
Adept at queries, report writing and presenting findings
General
BS in Mathematics, Economics, Computer Science, Information Management or Statistics
Only U.S.-based applicants can be considered at this time
Constructor.io powers product search and discovery for the largest retailers in the world, like Sephora and Backcountry, serving billions of requests every year: you have most likely used our product without knowing it. Each year we are growing in revenue and scale by several multiples, helping customers in every eCommerce vertical around the world.
We love working together to help each other succeed, and are committed to maintaining an open, cooperative culture as we grow. We get to the right answer with empathy, ownership and passion for making an impact.
Merchant Intelligence Team
An important part of our product is the Customer dashboard, which helps merchandizers analyze and influence user behavior. We provide a number of tools for them to control which products and product attributes receive more attention, creating value for their business. The goal of the Merchant Intelligence team is to:
- Help merchandizers achieve their e-commerce goals, increasing satisfaction with their sites and retention of customers.
- Provide insights to merchants they can't get anywhere else.
- Become a critical point of merchant team planning, decision making, and evaluation so we become a sticky part of their organization.
- Deliver new reports and tools for merchandizers and analysts from e-commerce companies.
- Improve the existing dashboard experience by building analytics that provide insights to improve KPIs.
- Perform data exploration and research user behavior.
- Implement end-to-end data pipelines to support realtime analytics for important business metrics.
- Take part in product research and development, iterate with prototypes and customer product interviews.
Requirements
- You are proficient in BI tools (data analysis, building dashboards for engineers and non-technical folks).
- You are an excellent communicator with the ability to translate business asks into a technical language and vice versa.
- You are excited to leverage massive amounts of data to drive product innovation & deliver business value.
- You're familiar with mathematical statistics (A/B tests; see the sketch after this list)
- You are proficient at SQL (any variant) and well-versed in exploratory data analysis with Python (pandas & numpy, data visualization libraries). A big plus is practical familiarity with the big data stack (Spark, Presto/Athena, Hive).
- You are adept at fast prototyping and providing analytical support for initiatives in the e-commerce space by identifying & focusing on relevant features & metrics.
- You are willing to develop and maintain effective communication tools to report business performance and inform decision-making at a cross-functional level.
- Stack: Python, NumPy, pandas, SQL, PySpark, Flask, Docker, Git
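As a hedged illustration of the A/B-test statistics mentioned in the list above, here is a minimal two-proportion z-test sketch in Python; the conversion counts and sample sizes are made up for this example.

```python
import numpy as np
from scipy import stats

# Hypothetical example: conversion counts and sample sizes are invented.
conversions = np.array([460, 520])   # control, variant
exposures = np.array([10000, 10000])

rates = conversions / exposures
p_pool = conversions.sum() / exposures.sum()
se = np.sqrt(p_pool * (1 - p_pool) * (1 / exposures[0] + 1 / exposures[1]))
z = (rates[1] - rates[0]) / se
p_value = 2 * stats.norm.sf(abs(z))  # two-sided p-value
print(f"lift = {rates[1] - rates[0]:.4f}, z = {z:.2f}, p = {p_value:.4f}")
```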
Benefits
- Unlimited vacation time; we strongly encourage all of our employees to take at least 3 weeks per year
- A competitive compensation package including stock options
- Company sponsored US health coverage (100% paid for employee)
- Fully remote team - choose where you live
- Work from home stipend! We want you to have the resources you need to set up your home office
- Apple laptops provided for new employees
- Training and development budget for every employee, refreshed each year
- Maternity & Paternity leave for qualified employees
- Work with smart people who will help you grow and make a meaningful impact
Diversity, Equity, and Inclusion at Constructor
At Constructor.io we are committed to cultivating a work environment that is diverse, equitable, and inclusive. As an equal opportunity employer, we welcome individuals of all backgrounds and provide equal opportunities to all applicants regardless of their education, diversity of opinion, race, color, religion, gender, gender expression, sexual orientation, national origin, genetics, disability, age, veteran status or affiliation in any other protected group.
About Us
Ever since the global privacy landscape changed in the last few years, having first-party data has become a requirement for e-commerce businesses. Northbeam steps in to provide our clients with the solution for their data needs, and it pays off. As a result, we’ve been seeing some unprecedented growth. We need your help in order to build the product that’s going to power the great ecommerce brands of the future and today.
We’re a remote-first company with team members in San Francisco, Los Angeles, New York, and more.
About the Role
You can expect to work on the following:
- Media mix modeling: understanding the relationship between our clients' marketing inputs and marketing outputs (sales, revenue), as well as quantifying the inherent uncertainty involved (see the sketch after this list)
- Communicating the model and its implications to non-technical stakeholders
- Working with our data engineering and product teams to productize the above for our growing customer base
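To illustrate the kind of media mix modeling described above, here is a minimal sketch of a Bayesian regression, assuming PyMC (one of the probabilistic programming libraries named under Requirements). Real media mix models typically add adstock and saturation transforms; the data, priors, and variable names here are assumptions invented for this example.

```python
import numpy as np
import pymc as pm

# Hypothetical example: spend data, priors, and variable names are invented.
rng = np.random.default_rng(0)
spend = rng.gamma(2.0, 500.0, size=(52, 2))                 # 52 weeks x 2 channels
revenue = 1000 + spend @ np.array([0.8, 0.3]) + rng.normal(0, 200, 52)

with pm.Model() as mmm:
    baseline = pm.Normal("baseline", mu=0, sigma=1000)      # sales with zero spend
    beta = pm.HalfNormal("beta", sigma=1.0, shape=2)        # non-negative channel effects
    sigma = pm.HalfNormal("sigma", sigma=500)
    mu = baseline + pm.math.dot(spend, beta)
    pm.Normal("obs", mu=mu, sigma=sigma, observed=revenue)
    idata = pm.sample(1000, tune=1000, chains=2)            # posterior draws
```

The posterior draws in `idata` are what let you quantify the uncertainty around each channel's effect, rather than reporting a single point estimate.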
Requirements
- Expertise with Python
- Strong skills in statistics, probability, and/or machine learning
- Experience with differentiable programming or probabilistic programming (e.g., TensorFlow, PyTorch, JAX, Pyro, PyMC)
- Ability to clearly communicate results to non-technical audiences, in English
- Experience (or enthusiasm) with working in a small, fast-paced company
Good to Have:
- MS or PhD with research focus on quantitative or statistical methods
- Experience working with marketing data
Values:
- Growth mindset - we’re always learning and growing
- Customer focus - we want to make the customer happy with our product
- Ownership mentality - we think like owners in the business
- Radical candor - we’re transparent and give direct feedback to one another
- Empathic disposition - we’re kind to one another and help each other grow
Benefits
- Significant equity package
- Generous base salary
- Healthcare Benefits (medical, dental, vision)
- Travel to meet with the team
As a leader in process intelligence technology, Skan leverages computer vision and machine learning to help organizations improve business performance by triangulating insights from people, processes, and technologies to accelerate business transformation.
Location
For this position, we are looking for remote candidates located in the United States and Canada.
As a Sr. Data Analyst, you will be responsible for building, modeling, and understanding the data produced by the Skan platform, and for efficiently transforming it into customer-friendly dashboards that surface business value insights for our prospects and current customers. You will have the opportunity to use data to drive business decisions, write SQL queries, and apply data mining and visualization techniques for our customers, providing crucial insight into business decisions, KPIs, and metrics.
What You’ll Do:
- Participate in value discovery workshops and help in documenting the statement of value
- Design new reports with a value centric approach
- Generate/extract data from our BI databases to support deep-dive activities.
- Prioritize, optimize and deliver timely updates for metrics.
- Work with the customer success team and value management teams to understand business challenges and provide data/analysis to help drive projects.
- Work with data to arrive at key metrics and KPIs for our customers, e.g., NPS (Net Promoter Score) and FPR (First Pass Rate); see the sketch after this list
- Create dashboards with a high level of creativity to bring out powerful business value through insights and stories, and present them to internal stakeholders
- Uncover trends and correlations through data mining and analysis to develop insights that can improve the business and help make effective decisions.
- Building dashboards and tools that will allow us to meet the success criteria and metrics for projects
- Analyze unique operational data to find new and deep insights that bring value to our customers
- Utilize Power BI and experiment with infographics to make our rich data as actionable as possible
- Collaborate with the product and data science teams to ensure we are utilizing the full capability of the platform
- Standardize and help build out our report library
- Present value-based insights to internal stakeholders
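As a hedged illustration of the NPS metric named in the list above, here is a minimal sketch in Python; the survey scores are made up for this example, and a real implementation would read them from the warehouse.

```python
import pandas as pd

# Hypothetical example: the survey scores are invented.
scores = pd.Series([10, 9, 8, 7, 6, 10, 3, 9, 8, 10])  # 0-10 scale

promoter_share = (scores >= 9).mean()
detractor_share = (scores <= 6).mean()
nps = 100 * (promoter_share - detractor_share)
print(f"NPS = {nps:.0f}")  # ranges from -100 to 100
```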
Requirements
What You’ll Bring
Need to have
- Bachelor's degree in engineering or similar technical field
- 7 years of Business Intelligence or Data Analytics experience
- 3+ years of experience with Snowflake and Microsoft Power BI
- 4+ years’ experience working in a data-specific role, with sound knowledge of database management, data modeling, business intelligence, SQL querying, data warehousing, and online analytical processing
- In-depth understanding of and experience with the Microsoft BI stack, including Power Pivot, SSIS, SSRS, and SSAS.
- Experience using Python to parse, structure, and transform data
Nice to have
- Strong analytical skills and the ability to dive into dashboards and queries
- Strong communication, organizational and interpersonal skills
- Ability to multitask and handle various priorities in a fast-paced environment
- Experience working at a fast-growing tech start-up / B2B SaaS
- Cross-functional leading experience
Benefits
- 100% coverage of Health Care Plan (Medical, Dental & Vision)
- Life Insurance (Basic, Voluntary & AD&D)
- Unlimited Paid Time Off (Vacation, Sick & Public Holidays)
- Short Term & Long Term Disability
- Remote position
Case Coordinator
Location: Remote – US
Time type: Full time
Job requisition ID: R29675
Change Healthcare is a leading healthcare technology company with a mission to inspire a better healthcare system. We deliver innovative solutions to patients, hospitals, and insurance companies to improve clinical decision making, simplify financial processes, and enable better patient experiences to improve lives and support healthier communities.
Work Location: Remote US
Position:
Responds to inbound, routine customer telephone inquiries regarding products, services, order status, and other general questions. Typically uses scripted dialogue and may escalate inquiries to product support, billing, sales, or return/repair. Logs calls and updates customer account records. At higher levels, may be asked to respond to submitted questions through outbound calling.
Core Responsibilities:
- Achieve daily, weekly, and monthly quality and production goals.
- Adhere to call handling process.
- Adhere to call quality standards by ensuring proper phone etiquette and adherence to scripts, make accurate and descriptive MMS documentation.
- Follow-up with members as assigned.
- Adhere to assigned schedule.
- Completion of all required Change Healthcare courses on Change Healthcare University (CHU).
- Review all correspondence sent by Change Healthcare and management with current information on state and federal regulatory requirements.
- Review all correspondence sent by Change Healthcare and management with updates on the organization.
- Comply with all company and department operational guidelines and policies.
- Participate in Change Healthcare staff and operational development programs as assigned.
- Performs other duties as assigned
Requirements:
- Bilingual (English/Spanish).
- 3 years of external customer service/call center experience (3 years for internal applicants)
- Education: High School Diploma or Equivalent
- Excellent Problem-solving skills
- Excellent Time Management
- Professional business demeanor and the ability to communicate effectively (verbal, written, and listening skills).
- Promote company services in an outbound call center (production driven) environment.
- Professional behavior with courteous, polite, and energetic qualities.
- High commitment to accuracy, high quality work, and detail oriented.
- Must be driven and motivated to exceed inidual and team goals.
Preferred Qualifications:
- Excellent data entry and internet navigation skills
- Ability to work independently.
- Demonstrates patience and empathy.
- Maintain a confident, helpful, and positive tone on all calls
- Able to learn and adapt to changing environments, applications, and software.
- Basic knowledge of Microsoft Teams, Word, Excel, and PowerPoint.
Working Conditions/Physical Requirements:
- General office demands
- This is a remote position. Required to have a dedicated work area established that is separated from other living areas and provides information privacy.
- Must live in a location with an internet connection.
Unique Benefits*:
- Flexible work environments
- Ready, Set, Grow Career Development Center & access to Change Healthcare University for continuous professional learning & development with more than 5,000 training assets
- Volunteer days, employee giving and matching gifts programs, community awards and dollars for doers, community partnerships
- Employee wellbeing programs and generous health plans
- Educational assistance programs
- US 401(k) or Group RRSP (Canada) savings plans with matching employer contributions
- Be sure to ask our Talent Advisors for more information on location specific benefits and paid time off policies
- Learn more at https://careers.changehealthcare.com
- *Eligibility for some benefits may be limited or not available for part-time employees, be sure to speak with your Talent Advisor.
Diversity and Inclusion:
- At Change Healthcare, we include all. We celebrate diversity and inclusivity, respect each other, and value our unique experiences. By being our authentic selves, we bring different perspectives into our work and relationships.
- Business Resource Groups (BRGs) play a central role in advancing diversity and inclusion at Change Healthcare. They deepen our understanding of different cultures, people, and experiences, and help foster an inclusive workplace.