- This topic has 28 replies, 13 voices, and was last updated 3 years, 7 months ago by Kamonporn Suwanthaweemeesuk.
-
AuthorPosts
-
-
2021-03-25 at 3:32 pm #26677SaranathKeymaster
To ensure data quality and integrity, the data management process should include the following:
– Audit trail/Timestamp
– User authentication and access control level
– Edit check and logical check
– Data backup and recovery plan
Do you have experience conducting a study (or have you seen other study projects)? Have those projects implemented these processes? Which computer software do you/they use to store and manage data?
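For illustration, an automated edit check like the ones listed above might look like the following minimal sketch. The field names (subject_id, age, visit_date) and rules are hypothetical, not taken from any particular system:

```python
from datetime import datetime

def edit_check(record):
    """Return a list of query messages for one data record."""
    queries = []
    # Required-field check: important variables must not be missing.
    for field in ("subject_id", "age", "visit_date"):
        if not record.get(field):
            queries.append(f"{field} is missing")
    # Range check: flag implausible values for manual review.
    age = record.get("age")
    if age is not None and not (0 <= age <= 120):
        queries.append(f"age {age} is outside the plausible range 0-120")
    # Format check: dates must parse as DD/MM/YYYY.
    date = record.get("visit_date")
    if date:
        try:
            datetime.strptime(date, "%d/%m/%Y")
        except ValueError:
            queries.append(f"visit_date {date!r} is not in DD/MM/YYYY format")
    return queries
```

A clean record produces an empty list; any returned messages become queries for the data manager to resolve.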
-
2021-04-02 at 3:49 pm #26844Auswin RojanasumapongParticipant
I have experience with two electronic data capture (EDC) methods: Google Forms and REDCap.
My experience conducting a study involving these processes:
Audit trail/Timestamp
-> Google Forms: I used Google Forms to capture data before I learned how to use REDCap. It provides a timestamp for each new record. When I first collected the data, I did not plan for an audit trail or for timestamps of events other than adding new records.
-> REDCap (Research Electronic Data Capture): this application provides timestamped records and an audit trail (it has event logs that capture any activity happening in the database, e.g. adding, correcting, or deleting data).
User authentication and access control level
-> Both EDCs I have used (REDCap and Google Forms) can set the access control level for each person. REDCap is better since it has more configuration options to limit who can access or edit the data.
Edit check and logical check
-> I set the input form to automatically validate the data input for every variable. One setting I used frequently is marking a variable as “* must provide value” to avoid missing important data. Both methods provide this function, but REDCap offers more configuration.
Data backup and recovery plan
-> Both methods are online platforms. I backed up manually by downloading the data frequently and saving it to my computer.
-
2021-04-03 at 12:35 am #26847Paravee Own-eiumParticipant
I don’t have much experience and have never heard of REDCap before, but now that I look it up, it looks like a great application for building databases. It has many functions that support GCDMP and seems customizable as well. Thank you so much for sharing your experience.
-
-
2021-04-03 at 12:07 am #26846Paravee Own-eiumParticipant
I have entered data into databases a few times. The programs were Microsoft Excel and the project’s website. As far as I know, there was no edit check or logical check, but the website was well-designed, making it easier to enter data. The website also used user authentication and recorded timestamps and user names of people who entered the data.
For data backup, the project I have participated in keeps copies of data in several places, such as the institute’s server and external hard disks. Also, when we do data analysis, we try not to work directly on the original files. We always copy the files and work on the copies instead to make sure that the original files are safe.
-
2021-04-03 at 9:45 pm #26872Auswin RojanasumapongParticipant
Thank you for sharing. I agree that working with a non-original file is a safe way to prevent loss of the original data. But working with copied files is sometimes confusing, especially when the database is still open for new records: many versions of the files get created and downloaded.
-
2021-04-04 at 11:17 am #26885Pongsakorn SadakornParticipant
Thank you for sharing. I think quality control is very important to ensure quality and achieve the research purpose. Excel is a basic data-checking tool, but when the data has many records, errors will occur due to human error. However, Epi Info is good software for getting rid of duplicate data.
-
2021-04-05 at 11:33 am #26899SaranathKeymaster
There is a lot of software available, and there is no right or wrong choice. But the selected one should fit the requirements of your work.
-
-
2021-04-03 at 9:39 pm #26866Sila KlanklaeoParticipant
I have experience with:
– Audit trail/Timestamp
– User authentication and access control level
– Edit check and logical check
– Data backup and recovery plan.
The software involved was HDC (Health Data Center), HOSxP, and HOSxP-PCU (HIS). -
2021-04-03 at 10:49 pm #26881Wachirawit SupasaParticipant
I have experienced data quality control very frequently during work. My work involved entering data into the database. I would like to describe routine activities of mine that relate to these concepts.
– Audit trail and timestamp: the research database records when a value was entered and who was using the application at that time.
– User authentication and access control level: we have both physical and logical controls. Entering the facility requires authorization; currently we use biometrics such as facial recognition at the entrance. Logically, we restrict access to the computer with a username and password. We also utilize Windows User Account Control (UAC), which restricts users from accessing some files or modifying the computer system.
– Edit check and logical check: the application uses edit checks by controlling the value type. For example, the specimen number can only be a number; if we enter characters into that field, it refuses to save to the database. We also use double entry to verify each other’s work.
– Data backup and recovery plan: we have a physical backup at a remote location that can be retrieved easily, such as the CDC server in Atlanta.-
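The double-entry verification mentioned above can be done programmatically by comparing the two independently entered data sets field by field. A minimal sketch, assuming records are keyed by a specimen number (the field names are made up for illustration):

```python
def double_entry_check(first_pass, second_pass):
    """Compare two independent entries of the same records.

    Both arguments map a specimen number to a dict of field values.
    Returns a list of (specimen_no, field, first_value, second_value)
    tuples describing every disagreement, for manual review.
    """
    discrepancies = []
    for specimen_no, rec1 in first_pass.items():
        rec2 = second_pass.get(specimen_no)
        if rec2 is None:
            # The record was never entered the second time.
            discrepancies.append((specimen_no, "<whole record>", rec1, None))
            continue
        for field, value in rec1.items():
            if rec2.get(field) != value:
                discrepancies.append((specimen_no, field, value, rec2.get(field)))
    return discrepancies
```

An empty result means the two passes agree; any tuples returned point the reviewer to the exact record and field to re-check against the source document.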
2021-04-05 at 11:38 am #26901SaranathKeymaster
It’s cool that your work has remote data backup to the CDC server. I’m sure the data security will be very strong.
-
2021-04-05 at 2:17 pm #26905Sittidech SurasriParticipant
Thank you for sharing a very good experience. I’m curious to know what level of user, or who, can access and see the data (timestamps)?
-
-
2021-04-04 at 11:10 am #26884Pongsakorn SadakornParticipant
I have experience with the data management process through conducting a national application named “TanRabad”.
1. Audit trail/Timestamp: TanRabad is software, so the audit trail and timestamp are always captured when the data is changed.
2. User authentication and access control level: TanRabad has authentication and access control aimed at public health staff, and whoever is going to use the application must be registered.
3. Edit check and logical check: at first, TanRabad faced a lot of data entry errors in TanRabad-Survey [a larva survey application], so the research team built an edit check process to help users correct values such as the name of the place, the location, merged places, etc. Nowadays, TanRabad can get rid of garbage data that is entered inappropriately.
4. Data backup and recovery plan: data is backed up every month, and we set a restore point in case the system goes down or some damage occurs to the database. -
2021-04-04 at 1:19 pm #26888Saravalee SuphakarnParticipant
In my previous experience, I implemented all of these processes (user authentication and access control level, edit check and logical check, data backup and recovery plan) except the audit trail/timestamp. However, none of the processes was fully operational; each still had weak points or defects that should be improved in future projects.
I have used Google Forms as an eCRF to collect data and export it in Excel file format, then used Microsoft Excel to manipulate and analyze the data. Google Forms can set the access control level for the data: study site staff who are collectors or interviewers cannot access or change the collected data; only authorized persons can. The data manager can observe and check collected data sent to the system through Google Forms functions or a linked Google Sheet. The platform does not allow sending feedback or data clarification forms to the collectors. After the system is closed, collected data are stored in Google Cloud and exported as Excel files for backup on hard drives.
-
2021-04-05 at 11:49 am #26903SaranathKeymaster
Thanks everyone for sharing your experience. Many of you have used Google Forms, which is a good tool for collecting data. However, for a real clinical study, Google Forms may not meet the standard requirements.
-
2021-04-05 at 8:59 pm #26922Navinee KruahongParticipant
This might be a silly question, but… can I ask why Google Forms might not meet the standards of a clinical study?
-
-
2021-04-05 at 2:08 pm #26904Sittidech SurasriParticipant
I do have experience with several studies (clinical trials and non-clinical trials). However, in most of them my role and responsibility was as the person who provided the data. I have been involved in some data entry and had a few chances to take the role of a monitor: checking the completeness of patient records, the accuracy of entries on the CRFs, adherence to the protocol and to Good Clinical Practice, and the progress of enrollment, and ensuring that the study drug was stored, dispensed, and accounted for according to specifications. As I mentioned, I have been involved in several studies and have seen others as well. I cannot say for certain whether they all implemented these processes, but I am sure they did, especially in the clinical trials.
As for the computer software used to store and manage data, I have seen many programs, such as Excel, Access, and eCRF systems (InForm™, DFexplore). -
2021-04-05 at 4:27 pm #26919NaphatParticipant
I have experience ensuring data quality and integrity with a logbook, as follows:
1. Audit trail/Timestamp
– The login and logout times of the program, and any activities or modifications in the database, are recorded.
2. User authentication and access control level
– Change the password every 6 months.
– Classify the workers’ levels of access to the data (data entry level, supervisory review level, and admin level).
3. Edit check and logical check
– Date and time: some projects have no definite, clear date and time format (DD/MM/YYYY, or 12-hour vs 24-hour format).
– Supervisory review that specimen no. and subject no. match.
4. Data backup and recovery plan (not in my current job)
– Weekly backup on the server.
– Monthly backup on a hard disk. -
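The tiered access levels described above (data entry, supervisory review, admin) can be modelled as a simple role-to-permission mapping that is consulted before each action. A hypothetical sketch; the role and action names are illustrative only:

```python
# Each role is granted a set of permitted actions.
PERMISSIONS = {
    "data_entry": {"enter"},
    "supervisory_review": {"enter", "review"},
    "admin": {"enter", "review", "edit", "export"},
}

def is_allowed(role, action):
    """Return True if the given role may perform the action."""
    # Unknown roles get no permissions at all (deny by default).
    return action in PERMISSIONS.get(role, set())
```

The deny-by-default lookup means a misconfigured or unregistered user can do nothing until explicitly granted a role.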
2021-04-05 at 8:53 pm #26921Navinee KruahongParticipant
All of these processes (audit trail/timestamp, access control level, edit check and logical check, data backup and recovery plan) are crucial for data quality and integrity. Unfortunately, I have never had any experience with clinical research, or any research that uses these processes. However, I do have some experience with a systematic review, which was my dissertation when I was studying for my first master’s degree. I adopted the PRISMA guideline to conduct my systematic review, which had a data management flow like the one we are studying.
For example:
– At the step of searching for relevant studies, I needed to state the date of my search along with my search strategy.
– I had a peer cross-check my abstract and title screening and data extraction.
Even though I don’t have experience with clinical research that uses all of these steps, I do have experience using data from a web application for mental health screening, which has a strict protocol and policy on data privacy and security. So I needed permission from a committee of my department, to get a password, before accessing people’s data. I do believe we have a data backup plan, but I just don’t know how my department stores the data. I need to ask them next time 🙂 And I have just found that we really cannot do edit checks and logical checks when dealing with a large data set, right?
2021-04-06 at 12:18 am #26930Khaing Zin Zin HtweParticipant
It’s a precious experience you just shared with us, and thank you for that. In my opinion, edit checks and logical checks are crucial especially for large data sets, because it is nearly impossible for us to run manual queries against a large number of records.
-
2021-04-08 at 3:12 pm #26983SaranathKeymaster
It depends on your role and responsibility in the research which access control level you would be given. Some levels can perform the edit checks (admin) or logical checks (data manager).
-
-
2021-04-06 at 12:11 am #26929Khaing Zin Zin HtweParticipant
I haven’t experienced study projects; however, recording TB cases for treatment monitoring and reporting is my daily practice. Case recording is done in a web application developed by our team. It has been integrated with:
– User authentication and access control: each user has his/her own username and password. The features accessible differ among user levels.
– Edit checks and logical checks: e.g., a conclusion of “No TB” for an investigated presumptive case together with a “positive” sputum examination result triggers a prompt box for a manual query.
– Data backup and recovery: all the data in the app are backed up on a purchased virtual server.
We have yet to discuss adding an audit trail to the app, since I have only just started to realize its importance.
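That kind of cross-field logical check can be expressed as a simple rule evaluated before a record is saved. A minimal sketch with hypothetical field names; the actual app’s logic is not shown here:

```python
def logical_check(record):
    """Return query messages for cross-field contradictions in one record."""
    queries = []
    # A conclusion of "No TB" is inconsistent with a positive sputum result.
    if (record.get("conclusion") == "No TB"
            and record.get("sputum_result") == "positive"):
        queries.append("Conclusion 'No TB' conflicts with a positive "
                       "sputum result; please verify before saving.")
    return queries
```

In the app this would drive the prompt box: a non-empty result blocks the save until the user confirms or corrects the contradictory fields.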
-
2021-04-06 at 1:35 pm #26934Rawinan SomaParticipant
Actually, I have never conducted a full RCT or research in the traditional way, but I will share some of my experiences. To prevent accidents, in my recent work a data backup was created as soon as the full data set was acquired. I stored my data in three places: my laptop, an external hard drive, and Google Drive. But I have not established a data recovery plan yet. For checking and validating data, I started these processes at the exploratory data analysis step. That does not follow best practice, because there should be checks during data entry, but I had not planned well enough at that point.
-
2021-05-20 at 9:16 am #27422Kamonporn SuwanthaweemeesukParticipant
I have no experience in this field. I’m looking forward to learning more.
-
-