Work With Us

We help companies reach their full potential. Are you ready to achieve yours? Join us.

We Can't Wait to Meet You.

We’ve been around for over five years, but you wouldn’t guess it from the energy in the place. Along the way, we have made our company a great place to work.

Current Openings

Feel fulfilled. Have fun. Help us to shape the future.


Create requirements traceability matrices, test plans, and test cases from business and functional requirement documents. Design policies, standards, and procedures for the testing life cycle of the data warehouse and data marts. Test the ETL mappings from source to target. Validate defects and resolve critical issues. Prepare exit reports, test execution status reports, and graphs. Skills required: Teradata, Hadoop, Snowflake, Salesforce, Java, Oracle, Jira, Informatica, and TOAD/SQL Developer. Bachelor’s degree in Science, Technology, Engineering, or any related field with 2 years of experience in the job offered or a related occupation is required. Work location: Suwanee, GA and various unanticipated locations throughout the U.S.
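The source-to-target validation work described above largely reduces to comparing extracts from the source and target systems. A minimal sketch, in which the in-memory rows stand in for actual queries against Teradata, Oracle, or Snowflake and the field names are illustrative assumptions:

```python
# Minimal sketch of source-to-target ETL validation: compare keyed extracts.
# In practice the rows would come from SQL queries against the source and
# target systems; the literal rows below are illustrative stand-ins.

def validate_mapping(source_rows, target_rows, key="id"):
    """Return keys missing from the target and keys whose rows differ."""
    src = {r[key]: r for r in source_rows}
    tgt = {r[key]: r for r in target_rows}
    missing = sorted(k for k in src if k not in tgt)
    mismatched = sorted(k for k in src if k in tgt and src[k] != tgt[k])
    return missing, mismatched

source = [{"id": 1, "amt": 10}, {"id": 2, "amt": 20}, {"id": 3, "amt": 30}]
target = [{"id": 1, "amt": 10}, {"id": 2, "amt": 25}]
missing, mismatched = validate_mapping(source, target)  # ([3], [2])
```

The missing and mismatched key lists would feed directly into the defect validation and test-execution reports mentioned above.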


Send Resume to HR Dept., XL Softek, Inc., 4255 Johns Creek Parkway, Unit A, Suwanee, GA 30024.


Should the candidate accept employment with XL Softek, Inc., the referring employee will be eligible to receive an award of $1,000.00 for the successful referral.

Location : Various
Type : Full Time

AWS DevOps Engineer:

Requirements: A minimum of a Bachelor’s degree in Computer Science or a related technical field is required to perform the duties.

Job Duties:

  • Specialist in migrating applications and infrastructure from physical server environments to AWS cloud-based virtual environments. Designed and deployed a multitude of applications utilizing most of the AWS services (including EC2, Route 53, S3, RDS, SNS, and IAM), focusing on high availability, fault tolerance, and auto-scaling.
    • Used GZIP with AWS CloudFront to forward compressed files to destination nodes/instances.
    • Designed MapReduce workflows with AWS EMR; utilized SNS and assigned an ARN to S3 for object-loss notifications.
    • Created users and groups using Identity and Access Management (IAM) and assigned individual policies to each group based on compliance and tagging requirements.
    • Integrated CloudWatch alarms with PagerDuty services to automatically create incidents and Slack alerts.
    • Created S3 backups by enabling versioning and moved objects to Amazon Glacier for archival purposes.
    • Created Classic ELBs and used Route 53 with failover and latency routing options for high availability and fault tolerance.
    • Configured security groups for EC2 Windows and Linux instances and for the Puppet master and various Puppet agents.
    • Increased EBS-backed volume storage capacity when the root volume was full using the AWS EBS volume-modification feature.
    • Leveraged AWS security groups, NACLs, internet gateways, and NAT instances to ensure a secure zone for the organization.
    • Created various custom dashboards and metrics for memory and CPU utilization, plus alarms and notifications, for EC2 hosts using Amazon CloudWatch.
  • Expert in Groovy, shell, and Python scripting to automate jobs and configure them to run periodically in Jenkins.
    • Wrote a Groovy script to approve 1,000+ pending script signatures/hashes in Jenkins at once during job migrations between production environments.
    • Developed a shell script that automates decrypting and storing log files in Amazon S3 using Python boto.
    • Wrote a shell script to take Jstack thread dumps and collect sysstat, vmstat, iostat, and top output samples every 60 seconds over 5-minute intervals during performance issues with the production application server; these thread dumps were provided to Atlassian, CloudBees, and JFrog support during server outages.
    • Wrote a Groovy script to remove offline build agents/slaves from the Jenkins console; the script checks only dynamic slaves (not static ones), deletes them, and reports the exit status to users.
    • Experienced in Bash shell scripting; created cron jobs for repetitive tasks on Linux machines.
    • Created shell and Python scripts for system administration and manual processes in AWS via the command line.
    • Wrote a Groovy script to clear pending/hung jobs from the Jenkins master build queue that were waiting for invalid slave labels or user input.
  • Specialist in building end-to-end CI/CD pipelines in Jenkins to retrieve code, compile it, perform tests, and push build artifacts to JFrog Artifactory.
    • Wrote code and stored it in the Git version control system.
    • Used Atlassian Bitbucket Server to commit and push changes from the Git client to Bitbucket.
    • Pulled code from Bitbucket into CloudBees Jenkins using the Git plugin and built it using tools such as Maven and Ant.
    • Integrated SonarQube with Jenkins and Bitbucket for static code analysis and set quality profiles and quality gates for the team.
    • Deployed build artifacts to JFrog Artifactory during jobs and tagged artifacts by build number.
    • Experienced in branching, build, and release strategies in Git for different projects.
    • Developed build and deployment standards with input from development, operations, and IT security.
    • Used Ant and Maven as build tools on Java projects for the development of build artifacts.
  • Proficient in automating the customer onboarding process across the Continuous Integration and Continuous Delivery applications.
    • Used HTTP methods (PUT, POST, GET, DELETE, etc.) to call the REST APIs for various processes otherwise performed via the UI.
    • Created a parameterized Jenkins job configured to pass project name and key arguments to the Jenkins, Bitbucket, SonarQube, and Artifactory applications.
    • Scheduled the job to run periodically, reading the customer intake forms and creating the new projects accordingly across the CI/CD applications.
    • Configured each project’s permissions exclusively for the customer teams depending on their AKMID.
    • Worked with customers to resolve integration issues between their Jenkins jobs and other platforms.
    • Installed and configured various community and Tier 1 plugins, and integrated Atlassian Bitbucket with Jira using OAuth authentication to link Jira tickets to code commits.
  • Expert in building Docker images per each team’s requirements and deploying them to the Artifactory server:
    • Automated Docker image creation by configuring parameterized Jenkins jobs.
    • The parameterized job takes a JSON file as input, builds the Dockerfiles, and stores them in a Bitbucket repo; a downstream job builds the images and pushes them to JFrog Artifactory.
    • Wrote and built custom Dockerfiles to create Docker images from scratch; developed Alpine Linux and CentOS images and used them as Docker containers.
    • Installed a private Docker registry for local uploads and downloaded Docker images from Docker Hub.
    • Installed and upgraded Node, npm, Maven, Angular, Cloud Foundry, and other packages on Docker images.
    • Virtualized traditional servers leveraging Docker containers for test and development environment needs.
  • Proficient in configuring static and dynamic slaves on Jenkins masters to use as build agents for customer jobs.
    • Launched EC2 instances using CloudFormation and Terraform templates for automated cloud deployments.
    • Configured Windows, Mac, and Linux instances as static slaves in Jenkins to offload work from the Jenkins master and run customer jobs on those slaves.
    • Configured dynamic slaves in Jenkins leveraging Kubernetes, Apache Mesos, and Marathon environments.
    • Automated the Jenkins slave installation process, from launching the EC2 instance with CloudFormation templates to configuring the Jenkins home directory.
    • Deployed, scaled, managed, and upgraded Kubernetes applications using Helm.
    • Troubleshot various Jenkins issues together with the CloudBees team in production environments.
    • Configured a Jenkins job to continuously monitor the status of slave nodes and relaunch them if disconnected.
    • Worked with Atlassian Bitbucket teams to resolve “remote hung up” issues for customer teams.
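The onboarding automation described in the duties above (a scheduled job that reads customer intake forms and creates matching projects across the CI/CD applications) can be sketched roughly as follows. The form fields, key-derivation rule, and application list are illustrative assumptions, not the actual schema:

```python
import json

# Illustrative list of CI/CD applications a team is onboarded into.
CICD_APPS = ["jenkins", "bitbucket", "sonarqube", "artifactory"]

def derive_project_key(team_name: str) -> str:
    """Derive an uppercase alphanumeric project key from a team name.
    The rule here is an assumption, e.g. 'Data Platform' -> 'DATAPLATFORM'."""
    return "".join(ch for ch in team_name.upper() if ch.isalnum())

def plan_onboarding(intake_json: str) -> dict:
    """Read a customer intake form (JSON) and plan one project name per
    CI/CD application; a real job would then call each app's REST API."""
    form = json.loads(intake_json)
    key = derive_project_key(form["team_name"])
    return {app: f"{key}-{app.upper()}" for app in CICD_APPS}
```

A periodically scheduled Jenkins job would run this planning step over new intake forms and then issue the corresponding REST calls (PUT/POST) to each application to create the projects and set their permissions.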

Location : Various
Type : Full Time

Software Developer:

Design and deploy custom tabs and validation rules. Design and develop classes, triggers, and various functional needs in the application. Implement test classes for code coverage. Develop applications and enhance validations, workflows, security settings, reports, and dashboards. Create workflows and perform data cleanup operations. Skills required: Java, JavaScript, Jira, Eclipse, Oracle, MS SQL, Ajax, CSS, and Apex. Master’s degree in Science, Technology, or Engineering (any) is required. A Bachelor’s degree in the above fields along with 5 years of experience in the job offered or a related occupation is acceptable in lieu of a Master’s degree. Any suitable combination of education, training, or experience is acceptable. Work location: Alpharetta, GA and various unanticipated locations throughout the U.S.

Send Resume to HR Dept., XL Softek, Inc., 4255 Johns Creek Parkway, Unit A, Suwanee, GA 30024.

Should the candidate accept employment with XL Softek, Inc., the referring employee will be eligible to receive an award of $1,000.00 for the successful referral.


Here are just some of the perks you’ll get as a member of XL Softek.

Health Care

No-premium health, dental, and vision benefits, plus 401(k) and FSA


Flexible work hours, unlimited vacation, and paid parental leave


Healthy catered lunches, dinners, and drinks on site


Board games, team outings, company parties, and more


Monthly wellness stipend (gym, work-out gear, etc.)

Join XL Softek

We make your business better, more secure, and more productive.

Leave your technology worries to us. Focus on your business. Let us help you achieve the growth you deserve.