Courses With Realtime Job Description Assignments | Online Homework Help
June 24th, 2019
I have 16 master's and Ph.D. level courses. For each course, you must write how it relates to the job description given below and, in general, how it relates to the work of a software developer.
Write at least 120-150 words per course, correlating each of the courses below with the given job description; you may also draw on general software developer/data scientist duties. Each course in the list below must be properly addressed with real-life work.
- Access Control
- Physical Security
- Information Security and Risk Management
- Business Continuity Planning and Disaster Recovery Planning – Barman
- Application Security
- Security Architecture and Design
- Operations Security
- Legal Regulations, Compliance and Investigation
- Cryptography
- Telecommunications
- Data Science and Big Data Analytics
- Emerging Threats and Countermeasures
- Information Governance
- Enterprise Risk Management
- IT Importance in Strategic Planning
- IT in Global Economy
Job Description:
- Responsible for being the technical point of contact for upper management, business analysts, project management, and other groups on the proactive monitoring project.
- Creating and maintaining probabilistic matching rules in the Mirth Match EMPI and using Match's match-quality reporting to rate matching effectiveness and identify areas for improvement.
- Extracting and analyzing raw data from Match's PostgreSQL database and serving as the SME for IHE PIX/PDQ feeds with other exchange participants.
- Creating the tooling needed to perform data-steward activities against Mirth Results' clinical data repository (stored in PostgreSQL). Such activities include detecting missing values from contributing systems and suggesting ways to resolve and prevent the issue (see the SQL sketch after this list).
- Detecting corrupted values from contributing systems and suggesting ways to resolve and prevent the issue.
- Responsible for SQL Server Reporting Services planning, architecture, training, support, and administration in the development, test, and production environments.
- Analyzing Mirth Results and Mirth Match data to find duplicate patient (demographic) records and identifying duplicates from the Mirth Match threshold score.
- Preparing and exporting ad-hoc reports from the different databases for testing scenarios and discovering areas where a potential for data standardization exists.
- Responsible for enabling analysis by producing information products and contributing to research and development efforts, with traditional programming (SAS, SQL, R, and PostgreSQL) and business intelligence (e.g., ELK) experience for creating dashboards.
- Determining which feed and Mirth Results data are useful for metrics and developing a Mirth Connect channel to automate metric-shipping feeds to Elasticsearch.
- Creating individual message-count visualizations in Kibana to show message statistics over a selected period and utilizing Kibana for reporting.
- Building an ETL process for continuously bulk-importing clinical data from SQL Server and PostgreSQL databases into Elasticsearch and into spreadsheets.
- Indexing and searching/querying a substantial number of documents in Elasticsearch, creating a Kibana dashboard for sanity-checking the data, and working with the Kibana dashboard for the overall build status with drill-down features.
- Creating data-discovery views, visualizations, and dashboards in Kibana for quick analysis of healthcare data.
- Developing a dynamic Mirth Connect channel with the help of JavaScript, XML, and SQL Server to import data from SQL Server and PostgreSQL databases into ELK, and producing different visualizations per the business requirements with the proper mapping.
- Creating Kibana dashboards for metrics that need to be trended for operational and business activities (message volume, ordered tests, etc.).
- Deploying advanced techniques (e.g., text mining, statistical analysis) to deliver insights.
- Creating an ODBC connection for Excel to pull data from PostgreSQL and SQL Server.
- Developing a Mirth Connect channel to import Mirth Connect statistics at a given interval into TimescaleDB in a PostgreSQL database for creating Grafana visualizations (a TimescaleDB sketch follows this list).
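For illustration, the data-steward and duplicate-detection items above come down to straightforward PostgreSQL queries. This is a minimal sketch only: the `clinical_results` and `patient_demographics` tables and their columns are hypothetical stand-ins, not the actual Mirth Results or Mirth Match schema.

```sql
-- Hypothetical check for missing (NULL or empty) result values per contributing
-- source system, in the spirit of the data-steward duties above.
SELECT source_system,
       COUNT(*) FILTER (WHERE result_value IS NULL OR result_value = '') AS missing_values,
       COUNT(*)                                                          AS total_rows
FROM clinical_results
GROUP BY source_system
ORDER BY missing_values DESC;

-- Hypothetical duplicate-demographics check: patients whose name and date of
-- birth appear more than once, a rough analogue of reviewing Mirth Match
-- candidates above a threshold score.
SELECT last_name, first_name, date_of_birth, COUNT(*) AS record_count
FROM patient_demographics
GROUP BY last_name, first_name, date_of_birth
HAVING COUNT(*) > 1
ORDER BY record_count DESC;
```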
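Likewise, for the TimescaleDB/Grafana item, the sketch below shows the kind of table a metrics-shipping channel could write into. Table and column names are hypothetical; `create_hypertable` and `time_bucket` are standard TimescaleDB functions, and the extension must already be available in the PostgreSQL database.

```sql
-- Assumes the TimescaleDB extension is installed in this database.
CREATE EXTENSION IF NOT EXISTS timescaledb;

-- Hypothetical destination table for Mirth Connect channel statistics.
CREATE TABLE channel_stats (
    captured_at  TIMESTAMPTZ NOT NULL,  -- when the statistics snapshot was taken
    channel_name TEXT        NOT NULL,
    received     BIGINT,                -- messages received in the interval
    sent         BIGINT,                -- messages sent in the interval
    errored      BIGINT                 -- messages that errored in the interval
);

-- Turn the table into a time-partitioned hypertable.
SELECT create_hypertable('channel_stats', 'captured_at');

-- Example Grafana-style query: hourly message volume per channel over the last day.
SELECT time_bucket('1 hour', captured_at) AS bucket,
       channel_name,
       SUM(received) AS received
FROM channel_stats
WHERE captured_at > now() - INTERVAL '1 day'
GROUP BY bucket, channel_name
ORDER BY bucket;
```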
Experience in applied SAS/R programming and SQL/PostgreSQL programming with complex RDBMS data structures.
- Performing various statistical analyses using SAS tools and Excel.
- Creating and extracting clinical data tables from MS SQL Server, text, CSV, and Excel files into SAS using SAS tools such as SAS/ACCESS, INFILE, and the LIBNAME engine.
- Utilizing SAS tools to analyze, validate, format, and consolidate raw data.
- Using SAS effectively to import and export raw data from one file format to another.
- Implementing a waterfall Software Development Life Cycle (SDLC) methodology for the design, development, implementation, and testing of various SAS modules.
- Using SAS and SAS tools on clinical trials data to perform sorting and merging to prepare and validate data.
- Working with financial data, such as analyzing salary sheets and inpatient and outpatient billing (a summary-query sketch follows this list).
- Creating CDISC SDTM and ADaM data sets from raw clinical datasets.
- Converting MS SQL Server data tables into SAS data files using SAS SQL and uploading SAS data files into MS SQL Server tables using PROC DBLOAD.
- Generating reports using PROC TABULATE, PROC REPORT, DATA _NULL_, SAS arrays, PROC SQL, and SAS macros.
- Building macros and creating macro variables using %MACRO and %MEND, and using DATA _NULL_ to help generate analysis data sets and create a specified structure.
- Ability to export SAS output to various formats such as HTML, Excel, and PDF.
- Able to use data manipulation procedures such as SAS/BASE, SAS/MACRO, SAS/GRAPH, SAS/ODS, Means, Report, Print, Univariate, Freq, Summary, Export, Import, Infile, ODS, Sort, Set, Merge, Tabulate, Format, and Informat.
- Creating, analyzing, and summarizing safety and efficacy SDTM, ODM, and ADaM datasets associated with clinical studies, per the specifications in the statistical analysis plans (SAPs) at the study and project level.
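The SAS reporting items above (PROC SQL, PROC REPORT, and PROC TABULATE summaries of billing and clinical data) are, at bottom, grouped summaries. The sketch below shows that kind of summary in plain PostgreSQL rather than SAS; the `billing` table and its columns are hypothetical.

```sql
-- Hypothetical inpatient/outpatient billing summary, comparable to what a
-- PROC SQL or PROC REPORT step would produce from the same data in SAS.
SELECT patient_type,                               -- e.g. 'inpatient' or 'outpatient'
       DATE_TRUNC('month', service_date) AS service_month,
       COUNT(*)           AS claim_count,
       SUM(billed_amount) AS total_billed,
       AVG(billed_amount) AS avg_billed
FROM billing
GROUP BY patient_type, DATE_TRUNC('month', service_date)
ORDER BY service_month, patient_type;
```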