
Data Solution Engineer

Romania · Full-time

About The Position

Are you passionate about data and about delivering solutions that turn clients’ data into valuable, actionable information for their business?

This is a newly created role within the Office of the CTO team, incubating and launching tech-enabled Data Analytics services that combine code- and process-based Intellectual Property (IP) to address common use cases in a unique way and differentiate AllCloud from its competitors. Each offering in the Data Analytics portfolio must be clearly positioned against a client need, consistently delivered, and provide direct value to our clients in the form of increased revenue, improved margins, or competitive advantage.

We’re hiring a Data Solutions Engineer with solid experience across the entire modern data stack: ETL/ELT, data warehousing, BI, and AI/ML. The ideal candidate has extensive experience creating data warehouses and writing highly efficient transformations in SQL and other languages, is comfortable with standard version-control and code-promotion practices using Git, has worked with data acquisition tools (such as Matillion), and is proficient in a coding language (Python/JavaScript). Experience with modern cloud data warehouses (such as Snowflake) and with ML models is highly desirable, as is excellent problem-solving ability when working with large volumes of data.

Summary of Key Responsibilities

The Data Solutions Engineer will research and design new offerings and features, build hands-on proofs of concept (POCs) to demonstrate each offering, and assist the delivery team in fully implementing and deploying the first versions. These offerings may focus on one area of the cloud data analytics stack or span several. The role calls for expertise in one or more of these areas and generalist knowledge of the others, and is expected to combine hands-on development with coordination with colleagues who have complementary domain experience.

Although the actual needs may change, the following are examples of past and future needs:

  • Custom connector within Matillion to load data into Snowflake
  • Reusable RBAC script to set up and manage security within a Snowflake instance
  • Data mart data model for Health Care KPIs
  • A predictive model to determine what is affecting quality in a manufacturing plant 
  • Dashboard displaying KPIs for the Retail/CPG industry based upon the Salesforce/Snowflake Data share 
  • Migration tool to convert Business Intelligence reports between BI tools
  • Other technologies, as required by client requests

Required Skills:

  • Bachelor’s degree in Computer Science, Data Science, Mathematics, or a related field; commensurate work experience will be considered in lieu of a degree
  • Experience supporting scalable cloud data solutions using MPP data warehouses (Snowflake, Redshift, Delta Lake, or Azure Synapse), data storage (AWS S3 or Azure Blob Storage), and analytics platforms (e.g., Spark, Snowpark, Databricks)
  • Demonstrated ability to communicate data insights and model results in business terms and articulate business value
  • Demonstrated ability to work with a wide variety of data for retrieval, movement, transformation, and storage
  • 3+ years developing and deploying scalable data warehouses/data marts using standard methodologies such as dimensional modeling and Data Vault 2.0
  • 3+ years developing and deploying data integration pipelines with GUI-based tools (Matillion, SSIS, Talend, Informatica, Kettle, Ab Initio, ODI)
  • 3+ years supporting the needs of Business Intelligence teams that use Tableau, PowerBI, Sigma, or similar tools
  • 3+ years working with any relational or non-relational data platform(s)
  • 3+ years working with business SMEs and end-users to understand and document their needs
  • 2+ years working in a Public Cloud (AWS, Azure, or GCP)
  • Intermediate or better knowledge of at least one scripting or OOP programming language related to data (Python, JavaScript, Shell)
  • Expert knowledge of SQL in any dialect
  • Experience using SCM tools such as GitHub, GitLab, or Bitbucket
  • Previous experience creating reusable processing code

Nice to Have Skills:

  • Hands-on Matillion experience
  • Hands-on Snowflake experience
  • Snowflake SnowPro Certification
  • Python, JavaScript, and Shell (Bash, ksh, or zsh) experience
  • Experience with dbt
  • GitFlow branch methodology experience
  • Data Vault 2.0 table structure and load patterns
  • Health Care, Retail/CPG, or Manufacturing experience
  • Experience with hands-off production promotions
  • At least one streaming framework (Kafka, Spark Streaming, etc.)
  • Predictive modeling experience
  • Software development experience
  • PySpark and Spark SQL experience
  • Experience with at least one audit log-based CDC tool (AWS DMS, GoldenGate, IDMC DBMI, IDMC PowerExchange CDC, Fivetran, Attunity, HVR)
  • Experience writing technical documentation for solutions

Why work for us? 

Our team inspires progress in one another and in our customers through a relentless pursuit of excellence; you will work with leaders who promote learning and personal development.


AllCloud is an Equal Opportunity Employer and considers applicants for employment without regard to race, color, religion, sex, sexual orientation, national origin, age, disability, genetics, or any other basis forbidden under federal, provincial, or local law.
