Give it another minute or so just to allow more people to join and then we will make a start.
OK, well we are past 11 so I am going to make a start.
Welcome everyone to this webinar, where we are going to be talking about the migration to version 2 of the IAPT data set and the migration to the new submission service, SDCS Cloud. We will be using this session to focus primarily on Data Quality.
My name is Paul Arrowsmith and I am the Delivery Manager for IAPT Migration, and we also have Amol who is one of our Senior Developers, and he is going to be helping me with part of the presentation.
So just a bit of housekeeping before we get going. We are expecting a fairly large number of attendees, so to avoid any excessive background noise, we are going to mute everybody as they join the call.
Similarly, with the large numbers we may find that we are struggling with bandwidth, so can I just ask that everybody disables their camera please. And again, just keep yourselves on mute if we can. If anybody has any comments or questions, if you can raise those through the chat facility, that would be appreciated. Just to make you aware, we are recording the webinar so that we can make it available on our website for people who have been unable to attend.
In terms of structure, I am going to spend the first 10 minutes or so giving you an overview of the delivery, how we have got to this point, and also calling out a couple of actions we need from you in the run-up to Go-Live.
Apologies to those of you that attended the webinar a few days ago; this introduction will largely be the same as in the previous one. There are one or two new pieces of information, but generally it will be a bit of a recap for those of you that were on the last one. After the introduction, I will hand over to Amol, who will give us an overview of the DQ reports and how to go about addressing the DQ issues you might come across in your submissions. That should leave us with about 30 minutes to go through any questions you might have, which hopefully should be plenty of time.
I should also point out that we have other people on the call, including Gavin Harrison from the Data Set Development Service. So, between us we should be able to answer any questions you have.
So, looking at the timeline for this delivery first, I will point out the colour coding: everything in blue is specific to the version 2 data set and submissions to SDCS Cloud.
The yellow items are version 1.5 items and relate to BSP submissions, and the items in grey are the work we have been doing preparing the SDCS Cloud platform ready for Go-Live.
So, going back to Autumn 2019. That is when the version 2 ISN was published, and the key activities I have called out here in terms of preparing for SDCS Cloud are really about risk mitigation.
Some of you may have been involved in the onboarding of the Mental Health or Maternity datasets to SDCS Cloud last year, and you will be aware that there were some issues around those deliveries. So, we tried to address those with the IAPT data set.
First, at the beginning of March, we migrated all active users from BSP, so at least that cohort of users did not need to go through the manual registration process. Then we ran two phases of private beta, one at the end of March and then again between the 13th of May and the 9th of June.
Obviously, you are all aware that we were originally on a path to go live on the 1st of May, but because of the COVID situation, the decision was taken to extend BSP for a further three months. It was this decision really that gave us the opportunity to run this second phase of private beta, which gave us some valuable learning which I will come onto later. Now, looking at the timeline for data submissions.
BSP submissions have been ongoing each month throughout this timeline, and the last BSP submission window will open on the 4th of September for July refresh data and will close on the 25th of September.
In parallel with that, we have the first submission window on SDCS Cloud opening on the 1st of September for August primary and closing on the same day as the BSP window, the 25th of September. And then finally, on the 1st of October we are operating purely in an SDCS Cloud world, with the submission window opening for August refresh and September primary.
If you would like to see other submission dates, these can be found on the IAPT pages on our website. What I should just point out, though, is that we have recently taken a decision, considering the DQ issues that we are going to come on and discuss, to give submitters more time to resolve any DQ issues: this August refresh window will be extended by another month.
So, from the 1st of November, the standard windows of the September refresh and October primary collections will still be open, but it will also still be possible to continue submitting August refresh during that month. So, all in all, you will have three months covering August data.
So just a few more words about registration then. This webinar (or at least a version of it) was first delivered on the 28th of February. A bit of a typo there on the slide, sorry: it was the 28th of February, not the 26th. But it was around the end of February, and some of you may well have dialled into that call. Following on from that, we automatically migrated 98 active users from BSP onto SDCS Cloud.
The thing is that the use of SDCS Cloud is dependent on two-factor authentication to verify the user's identity. So, the message in that first webinar was really targeted at those 98 users, to encourage them to log onto SDCS Cloud and complete their two-factor authentication setup so that any issues are resolved ahead of Go-Live.
And as of 16th of April, going back a little bit in time now, we were already in a good place. 50% of those users had done that set up. With the three-month extension and the flurry of activity around the private beta, we put this registration work on hold for a while, but we are now picking it up as we head towards the 1st of September.
Since the migration in March, we have had 41 new users that have followed the manual registration process. And following the private beta they have been added now to the SDCS Cloud user base. So now we have about 90 users that still need to complete their registration.
So, at the bottom there, in the penultimate bullet point: those 90 users. It is imperative that those 90 users, if they have not already done so, log onto SDCS Cloud and complete their two-factor authentication so that we avoid any registration issues at Go-Live. This is something that hit us hard during the onboarding of the Mental Health and Maternity datasets, so I cannot stress enough the importance of completing this action. Just a word about the note at the bottom: if you already use SDCS Cloud, say for Mental Health or Maternity submissions, and you were also part of that migration of IAPT accounts from BSP, you just need to confirm which email address is associated with each account. If you are using different email addresses for SDCS Cloud and BSP, then you will still need to complete the two-factor authentication on your migrated account. If the addresses are the same, then you are good to go; the migration will have handled everything for you.
So, I am going to come on and give you an overview, a flavour of what happened in the private beta.
As you can see, we had 19 organisations involved. Some of the organisations have multiple sites, so in total we had 35 sites involved in the private beta. There is a little bit of sensitivity here, because I have put the sites in order of how successful they were in eliminating or reducing DQ errors, so I have anonymised the names of the sites involved.
The main thing I want to get across here, looking at the second column, is the number of times the sites had to submit data. I should point out that this was over the course of both phase one and phase two of the private beta, but a significant number of submissions had to be made to get the level of success that we achieved, and that is the key learning from the private beta that we want sites to take forward into the Go-Live. The final three columns show the statistics for their final, and generally best, submission. In terms of the colour coding, green means that the final submission was completely successful: the file was accepted into downstream processing and none of the records were rejected. All the ones in amber had some level of record level rejections, but you can see, in terms of the volume of the file and the number of records rejected, the volumes are very low. So it gives us a high degree of confidence that we are going to get good DQ out of these submissions.
So, the key findings from the private beta then. I think the first thing to understand is that the private beta was based purely on data from December 2019, the idea being to compare the publication measures from the private beta with those presented in the actual December publication from BSP. On the whole we found a pretty good match between the publication and the private beta measures, which, as I say, gives us confidence that we are in a strong position for Go-Live. As I said, it took a significant number of submissions, with the private beta sites working iteratively to eliminate DQ errors.
The SDCS Cloud service provides DQ reports back to users, but they do have a different format from those
provided by BSP and we realised during these private betas that the sites were sometimes struggling to understand how to use these reports. So, we've taken that learning and used it to produce a guidance document which is now available on the SDCS Cloud pages on our website. And in a minute Amol is going to focus on these DQ issues and provide an overview of the DQ guidance document.
So, the second action I want you to take from this is to recognise the issues experienced by the private beta sites, take in what Amol says over the next few minutes, and try to make time to review the DQ guidance document in more detail.
And critically, please try and submit data as soon as possible after the 1st of September to give yourself enough time to work on your own DQ over the course of that first submission window. I will ask: if anyone is going to have a problem submitting data early, please put something in the chat facility so that we understand the issues you are experiencing at a local level, and we can perhaps look at ways to help you out.
OK, so that is it for me. I am going to take a breather and hand over to Amol who is going to cover the DQ section.
Over to you Amol.
Hello and Good Morning everyone and thank you for joining this webinar.
The main objective of this webinar is to understand the validation error types, how effectively we can identify failed records, and how easily and quickly we can correct them, so that higher data quality is achieved in each submission. These slides on Data Quality have been prepared based on our experience of pilots 1 and 2.
So, in principle there are 4 main types of validations. The first is file level rejection, then record level rejection, then group level rejection, which can be subdivided into 'more than one group submitted' and 'no valid group submitted', and the last one is warnings. The first and most important one is file level rejection. A file level rejection is a validation error that highlights a specific data quality issue where, if it is triggered, the whole file will be rejected, so we need to be careful when submitting data that no validation error of this type is triggered. I have just given a simple example. There are a few mandatory tables in the IAPT data set; one of them is the header, and if the header table is empty, the system will trigger IDSREJ002, Failed Content Check.
The header table is empty, and the whole file will be rejected. There are a few other file level rejections which you can identify by going through the Technical Output Specification (TOS); there is a separate tab in the TOS for file level rejections.
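As a rough sketch of the kind of check involved (this is not the real SDCS Cloud implementation, and the XML element names here are purely an assumption for illustration), a file-level check for an empty header table could look like this:

```python
import xml.etree.ElementTree as ET

def file_level_errors(xml_text):
    """Sketch only: an empty header table fails the whole file (IDSREJ002)."""
    root = ET.fromstring(xml_text)
    header = root.find("Header")  # element name is an assumption, not the real schema
    if header is None or len(header) == 0:
        return ["IDSREJ002: Failed Content Check - header table is empty"]
    return []

# A submission whose header table has no rows is rejected outright.
errors = file_level_errors("<Submission><Header></Header></Submission>")
```

The point is that this class of error gates the whole file: nothing downstream is processed until it is fixed.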
The second type of rejection is record level rejection. This type of validation highlights a data issue in a specific column, which causes the whole record to be rejected even though the system will process the rest of the file; it will reduce your data quality depending on how many record level rejections are present in the file. So, this is an example from the IDS002 GP table. It is a pretty standard error where the local patient ID is more than 20 characters, so the first record in this table will be rejected with a record level rejection, IDS00202, where the local patient ID has an incorrect data format. This type of error is easy to identify from the data quality report, because the key, LPI00000........1, is included, so you can easily identify and correct it. This is pretty standard across most of the record level rejections: the key values are present in the DQ report to identify the records which are causing problems.
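To illustrate (the field name and limit are taken from the example above, but this is a sketch, not the actual validation code), a record-level check like IDS00202 amounts to a per-column test that drops only the offending record while the rest of the file processes:

```python
MAX_LPI_LENGTH = 20  # the 20-character limit described above

def split_records(records):
    """Sketch: reject records whose LocalPatientId breaks the format rule."""
    accepted, rejected = [], []
    for rec in records:
        if len(rec.get("LocalPatientId", "")) > MAX_LPI_LENGTH:
            rejected.append((rec, "IDS00202: LocalPatientId incorrect data format"))
        else:
            accepted.append(rec)
    return accepted, rejected

rows = [
    {"LocalPatientId": "L" * 25},   # too long: this record alone is rejected
    {"LocalPatientId": "LPI001"},   # fine: still processed
]
accepted, rejected = split_records(rows)
```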
The third one is group level rejection. This is divided into 2 types: 'more than one group submitted' and 'no valid group submitted'. The first one, more than one group submitted, is very easy to identify; it is just a standard primary key or composite key type of validation. So, we purposely put a composite key validation example from the IDS007 table.
If these two records are submitted in the submission file, then a group level rejection will be triggered because there is a duplicate local patient identifier plus disability code combination; it is a composite key duplicate record. There are other types of group level rejection that look at a single column to identify duplicate records, but in this example it looks at the combination of local patient ID and DisabCode, and the validation message shows you exactly which keys: LPI001 and disability code 01. So, this type of validation is also easy to identify from the data quality report.
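A duplicate-group check like this can be replicated locally before submission. A minimal sketch, with the field names assumed for illustration:

```python
from collections import Counter

def duplicate_groups(records, key_fields):
    """Return composite-key values that appear more than once."""
    counts = Counter(tuple(r[f] for f in key_fields) for r in records)
    return [key for key, n in counts.items() if n > 1]

# Hypothetical IDS007 rows: the LocalPatientId + DisabCode pair must be unique.
rows = [
    {"LocalPatientId": "LPI001", "DisabCode": "01"},
    {"LocalPatientId": "LPI001", "DisabCode": "01"},  # duplicate combination
    {"LocalPatientId": "LPI001", "DisabCode": "02"},  # different DisabCode: fine
]
dupes = duplicate_groups(rows, ["LocalPatientId", "DisabCode"])
```

Running the same check over your extract before submitting would surface these combinations without waiting for the DQ report.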
The next one is the 'no valid group transmitted' example; this can occur for 2 reasons.
Reason 1, in the example we have given, is where it is triggered because there is no corresponding record present in the parent table. We have tried to put a very simple example here which explains the scenario: in the IDS001MPI table no LPI001 record is submitted, only LPI002, but in the GP table, which is a child table of MPI, 2 records are present.
The first is LPI001 and the second LPI002, so for the first record in the GP table there is no corresponding record in the MPI table, and in this case the system will trigger IDS00220, group rejected as no valid IDS001 group transmitted for this local patient identifier. This type of error is again easy to identify; we will go through it in detail in a short while. The next slide is the most confusing one: when records in the child table are rejected because the corresponding record in the parent table fails a validation error. The most significant difference between the previous example and this one is that in this case the LPI002 record is present in both tables, the IDS001 MPI table and the IDS002 GP table, whereas in the previous scenario the record was not submitted at all. In this scenario the record is present.
But the ExBAF indicator is invalid, and because of that the IDS001MPI table record gets rejected with the IDS00159 validation error. Because the MPI table record is rejected, there is no corresponding MPI record for the GP table record, and the system triggers IDS00220, group rejected, no valid group transmitted for this local patient identifier. The question is why we do this: the reason is that an IDS002 record is meaningless if there is no corresponding parent record in the MPI table, and this preserves the referential integrity of the system.
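The cascade Amol describes can be sketched in two passes: first validate the parent (MPI) records, then reject any child (GP) rows whose parent did not survive. This is an illustration only; the field names and the permitted ExBAF values are assumptions, not the real rule set:

```python
VALID_EX_BAF = {"Y", "N"}  # assumed permitted values, for illustration only

def surviving_parents(mpi_records):
    """Pass 1: parent (MPI) records that pass their own validations."""
    return {r["LocalPatientId"] for r in mpi_records if r["ExBAF"] in VALID_EX_BAF}

def group_rejections(mpi_records, gp_records):
    """Pass 2: child (GP) rows left with no valid group (the IDS00220 case)."""
    parents = surviving_parents(mpi_records)
    return [r["LocalPatientId"] for r in gp_records
            if r["LocalPatientId"] not in parents]

mpi = [{"LocalPatientId": "LPI002", "ExBAF": "X"}]  # invalid indicator: parent rejected
gp = [{"LocalPatientId": "LPI002"}]                 # child loses its valid group
rejected = group_rejections(mpi, gp)
```

This is why a group rejection in a child table often has its root cause in a parent table failure: fixing the parent record clears both errors.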
To understand this scenario in more detail, go to the SDCS Cloud page, where we have created the SDCS Cloud data quality guidance document, and when you click on that, move on to page 11. It is a very useful document; we have put together a lot more detailed scenarios, and it is a pretty standard guide which is valid across all datasets, including MHSDS, Maternity and CSDS as well as IAPT. We identified the top ten validation errors in pilots 1 and 2, from both IAPT and CSDS, and based on that we drafted this document to help all the sites identify validation errors. When you come across a group level rejection, here is how you can tackle it effectively. First, check whether the record is present in the parent table, which can be done easily by filtering the data quality reports. If there are any failures in the parent table, then there is a good chance the group level rejection was triggered because of those parent table rejections. To identify the parent-child relationships, see the next page.
We have added the parent and child table relationships for each data set. This one is IAPT, and if you scroll down further, the Community as well as the Mental Health and Maternity datasets are there. This table represents the parent-child relationships and is colour coded. For example, if there is a group level rejection in the IDS006 or IDS007 table, then you need to look for its parent, which is the 202 care activity table, and then you need to traverse right across to 201, and the 201 parent is the referral, and so on. This table will help you easily and effectively identify the parent-child relationships and find those failed records. I would recommend you go through this data quality guide. I will just quickly show you the index table, which gives all the details of the validation error types, common issues, and resolving data quality errors. It is a very useful document.
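The lookup described above can be kept as a small local mapping so that, given a failing table, you know which parents to filter the DQ report on. The identifiers below are an illustrative subset built from the example Amol gives; check the table in the guidance document for the authoritative map:

```python
# Illustrative subset only - the guidance document holds the real mapping.
PARENT_OF = {
    "IDS007": "IDS202",    # per the example above: 006/007 -> 202 care activity
    "IDS006": "IDS202",
    "IDS202": "IDS201",    # care activity -> 201
    "IDS201": "Referral",  # and the 201 parent is the referral
}

def parent_chain(table):
    """Walk up the parent-child map, listing the tables to check for failures."""
    chain = []
    while table in PARENT_OF:
        table = PARENT_OF[table]
        chain.append(table)
    return chain

chain = parent_chain("IDS007")
```

Working up the chain in this order mirrors the advice in the guide: a rejection at the bottom is often caused by a failure further up.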
The final one is warnings. Warnings are where the validation process identifies an issue with the submitted file but does not reject any records. There are still some data quality issues, though, and if you correct them you will achieve higher data quality stats. In this example, in the IDS007DisabilityType table, an invalid DisabCode is submitted and the system triggers the IDS00707 warning: disability code contains an invalid disability code. Again, it is identified with a key, which is LPI1111 and disability code 11.
You can easily identify this type of error. That is all; we are now open to any questions. Thank you.
Thanks very much Amol. OK, I am going to start working through the questions that have come through the chat facility.
Gavin, can I ask you to help as well? What tends to happen when I do this is that I answer things that have already been answered further down, so if any questions and answers come through while I am talking, please shout out so that I am not repeating things. OK, I am going to scroll back up to the first entry, which is 6 minutes past 11.
Can you do both primary and refresh in one file like we did in BSP? This has come up in previous webinars and the answer is no, they are separate submissions. Saskia has added her response there: you do need to submit two separate IDBs, one for primary and one for refresh, as per MHSDS.
Next Question - I am afraid I am going to struggle with this and I am not sure if anybody else on the call is going to be able to answer it. This is more of an SDCS Cloud registration type problem and I am not sure we have got the relevant people on the call. My understanding is that typically the two-factor authentication is through your mobile phone, but there is a desktop application available as a separate option if you don't want to, or you can't, or you don't have a work phone or whatever the problem
might be with using a mobile. There is an alternative solution, and my understanding is that all of those issues that we experienced in the early days of SDCS Cloud have been resolved, but I am not going to be able to answer this specific question on the Authy desktop app. Can I just ask people from NHS Digital, or anywhere else who has experience with this, if you can offer any information on that over and above what I have said?
Amol here. There could be an option of upgrading your Authy app. I was using the old version, and I reinstalled the new version of Authy and then did the migration, so you could look for the migration guide on the Authy website.
OK thanks Amol, I hope that helps. Obviously, any further questions, fire them in the chat further down.
So, from Allison: for information, our system supplier, Mayden, is not enabling the version two submission until the 10th of September. OK, this is the first time I have heard the 10th; we were aware of this with IAPTUS users when we were on the 1st of May trajectory.
They were planning to do the switch over on the 13th of May, so I was assuming it would be the 13th of September in this case as well, but obviously they have decided to go with the 10th now. So, anybody who is an IAPTUS user, you have probably received this communication multiple times from Mayden anyway, but please be aware that you need to do your final BSP submission before the 10th of September, at which point your system will switch over and you will be able to start submitting your first SDCS Cloud data.
Can I ask: did 16 out of 19 pilot sites struggle to provide the submission, and if so, what were the main hurdles? OK, I guess the high-level response is the DQ guidance document; the main purpose of that document is really to articulate and provide guidance on the issues that the pilot sites experienced. There were other things that are probably worth taking note of. One of the key things that hit us early in the pilot is that providers were struggling with these submissions because of lockdown. People were working remotely, and a lot of people were struggling with bandwidth, so they did suffer timeouts. They then had to find some way of working remotely on a server to get the submission in, rather than using local infrastructure. So, we had a lot of problems like that. Will we still be in that place on the 1st of September? I guess a lot of you will be, so it is just worth bearing in mind that this can be an issue. If you start seeing a timeout, typically it is because of a bandwidth problem with your local infrastructure.
I do not know; there are probably lots of issues we could call out there. Amol, is there anything else you wanted to add in terms of the main hurdles? You have obviously called out the group level rejections that you have already talked about. Is there anything else you would specifically like to raise?
No, I think the main high-level hurdle was understanding the validation errors, which we are covering now in our DQ validation guidance document. So that will really help everyone.
Next question - we cannot make a test submission; the button is greyed out, not sure why. OK, test submissions. This is functionality that has recently gone live on SDCS Cloud. It is currently only available for Mental Health and Maternity, and that is because you need to have a submission window open for the test submission to work. So, technically speaking, you are not going to see it for IAPT until the 1st of September. However, we are working with the SDCS Cloud team to try to get to a position where we can open up a window before the 1st of September, which would allow you to make some test submissions early, and hopefully you will get DQ reports back as part of the test submission. By allowing you to submit your data in August, if we can get a window open in August, it would let you send those test submissions in and start working on the DQ issues you might be experiencing locally. If we manage to resolve that with the SDCS Cloud team, then we will obviously communicate that out separately and make everybody aware that it is available. But as things stand, it will not be there until the 1st of September.
Next Question - Where will we be able to find this recording?
So, the recording is probably going to take us a few days, because we need to make sure it satisfies accessibility rules; we need to do some work on the transcript once the recording is finished. But hopefully by the end of this week we will make it available. We think at the moment we will be publishing it on the IAPT web pages directly, but there may well be a link out to it from the SDCS Cloud page as well, because obviously some of the content relates to SDCS Cloud.
Next Question - Will the file be rejected if a clinical contact is submitted against a care personnel record where no qualification is recorded in the therapist qualification table?
So, this is one for you Gavin.
It will not reject the file. There are two tables in question here. The first is the IDS202 care activity table, where you identify the care personnel, and it will not cause any problem with that table. You can capture more details about the qualification of that care personnel in the 902 table, and if you do try to submit that without a qualification, then just that 902 record would be rejected. But your file would still come through, because there is no point submitting a care personnel qualification record if you do not tell us what the qualification is. So, to answer the original question, your file will not be rejected and your 202 care activity record will not be rejected; it would just be the care personnel qualification record. It is explained a bit more in the technical output specification.
OK thanks Gavin.
Next Question - Can the 19 pilot sites be split by system supplier?
Yes, they can. We do have that information, but it is going to be difficult for me to give you specifics on this call. What I can say is that across those 19 organisations, as I said in my presentation, 5 of them were IAPTUS users, 8 of them were PCMIS users, one was CORE IMS, and the rest were in-house developed systems. So, that gives you an idea of the split, and certainly with the likes of PCMIS, we went through several iterations with the supplier rather than working directly with the sites. So PCMIS, and probably IAPTUS as well, have done a lot of work on the reports that are generated out of the system, which are then used to generate the IDB. So, anybody who uses those systems will receive the benefit of all the improvements that were made off the back of the private beta.
Next question - is there a document that summarises all the possible rejection codes and what they mean?
So, I think, Gavin, I saw something where you called out the technical output specification in response to this. Yeah, so that answers the first part of the question; it does not answer the part about what the DQ report would look like out of the system.
No, I do not think we provide anything like that in the guidance document, do we, Amol? Is there anything you can say in response to what the DQ report looks like?
No, it is not in the guidance document, but if you have access to Mental Health or Maternity then you can download and see examples there. If not, then Paul, is it possible we can publish a sample DQ report or send one separately?
Yes, and then we could look to publish that on the same page as a guidance document.
Yes, we can, we can take this offline
OK, yes, we will take that action and take that back and then look at that.
Next Question - Is there any list of fields that were not mandatory before but are now mandatory?
So again, I suppose the high-level response is that it is in the technical output specification, but is there anything else Gavin?
It goes back to when the information standard was published. The document you want is the change specification. Please look for the link to the DCB web page; there you can see, at item level, the before and after, so you can compare side by side. Look for the change specification on the DCB website.
OK, and then I think there is another one for you here.
Next Question - Do records get rejected for future-dated information, e.g. when a GP practice end date is after the period end date?
I will double check that one. We do not usually allow future dates, because we are looking at a specific reporting period, but I will just check that specific example.
Yes, if the start date of the GP registration is after the end of the reporting period, then that GP record would be rejected and, as a consequence, the patient record would go too. So, in the example of a September reporting period, we would not be expecting a start date in October; we would not want that information until you do your October submission, so that is possibly what is causing that.
OK, there are obviously some questions here around where to find the guidance document, which Ashley has addressed. Thanks for that, Ashley.
Next Question - Do you have any example table to display the DQ from any of the pilot submissions?
We certainly cannot do that today, and I think the problem would be sensitive PID data, because it was live data coming through from the pilots. But what we can do is the sample DQ report that we just talked about, so hopefully that will satisfy that requirement.
Next Question - Do all records need an NHS number?
Again, Gavin I will let you take this one.
Yes, technically we do, wherever possible, expect you to submit that, but it is not mandatory. If you do not submit it, you will get a warning. The other caveat is that it has got to be a valid NHS number: if it fails the Modulus 11 check, then it will actually reject the entire file. You have also got to use valid NHS numbers in any testing. So no, you don't have to submit it, but if you do, we would expect it to be a valid one, and you may get warnings if you don't submit it at all. So the message is: try and submit it, and make sure they are valid.
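The Modulus 11 check digit algorithm for NHS numbers is published in the NHS Data Dictionary, so it can be replicated locally to pre-check submissions. A minimal sketch:

```python
def is_valid_nhs_number(value: str) -> bool:
    """Modulus 11 check for a 10-digit NHS number."""
    digits = value.replace(" ", "")
    if len(digits) != 10 or not digits.isdigit():
        return False
    # Weights 10 down to 2 applied to the first nine digits.
    total = sum(int(d) * w for d, w in zip(digits[:9], range(10, 1, -1)))
    check = 11 - (total % 11)
    if check == 11:
        check = 0
    if check == 10:
        return False  # 10 can never be a valid check digit
    return check == int(digits[9])
```

For example, the commonly used test value 943 476 5919 passes the check, while changing its final digit fails it, so a pre-submission sweep of this kind would catch the file-rejecting case Gavin describes.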
OK thanks Gavin. There are a few comments here about the authorisation on SDCS Cloud, so thanks
For those of you that have submitted those comments.
Next Question - The main issue we had with the beta submission was out of area referrals which don't have a CCG code. As we don't have every out of area GP surgery on our system, is it recommended that we change this so we include the out of area CCG codes?
Yes, this rings a bell actually from the private beta. Gavin or Amol?
There are a few places where we capture the CCG code, so I might be answering about the wrong area. We do try and capture it in the GP table, but it is not a mandatory field, so if you don't have that information in your system, we don't expect you to try and get it. You do need it in the referral table though; it is mandatory there, so I do not know if that will cause an issue. You will have to submit a code for the organisation identifier of the commissioner, and most of the time we would expect that to be your own commissioner. But the example given here looks like it was around the GP table. If it is the GP table and you genuinely don't have that information, we can do local mapping as part of the derivations to figure out which CCG is responsible, so you can leave that one blank in the GP table. We would expect it in the referral table, because as the ones submitting the referral, it is probably your CCG that is responsible. Hopefully that answers your question, but if not, do follow up.
Yep, OK, thanks.
Next question from Samuel: I got migrated to SDCS from BSP but only have access to one team's data. They are asking me to complete a separate DUC form for each of the services before I can gain access to the rest. This cannot be right.
I am not sure I totally understand the question, but I think it probably is correct. If I am reading it correctly, the fact of the matter is that you need to have approval from your SIRO to say that you are permitted to have access to a particular data set.
So yes, it is right that you need to submit separate forms for Mental Health, Maternity, Community and IAPT. If that is what you're asking, then yes, that is right; we need to have that clarity around each data set.
I am assuming that's what you mean. I think we have Tanya on the call. Please jump in if there is anything else you want to add around that.
No, that was absolutely spot on. So yes, we need to have a form for each organisation and for each data set. Thanks Tanya. Unless I have misunderstood the question, in which case please send something else into the chat.
Next Question - The issue I found with the two-factor authentication was that, because the times on the mobile and the computer can be different, maybe 30 seconds apart, it can fail. So it is better to use the desktop app for the authenticator instead of the mobile. OK, that is interesting. Thanks for that; it might be useful for other people if they are experiencing the same problems with the mobile.
Next Question - can we submit blank GP start and end dates for GP Practice codes submitted in the GP Practice table?
Technically you can; there is just one thing to bear in mind. We need to know what your GP Practice is to decide which CCG it belongs to. So, if you don't submit any start and end dates, you will only be able to submit one GP Practice record, because otherwise we wouldn't know which is the relevant one. If you try to submit two or more records with no start and end dates, we would reject both of those and then, as a consequence, reject the patient tables.
You can do. We would definitely prefer to see the start and end dates there, but if you genuinely don't capture those in your system, there is a way around it. It must be only one record, though, so that we can assume it is the active GP Practice for that patient.
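The rule just described could be pictured as a local pre-submission check along these lines. This is a sketch only, assuming a simple list-of-dicts view of one patient's GP Practice records; the field names and practice codes are illustrative, not the real IDB columns.

```python
# Sketch of the GP Practice table rule described above:
# records without start/end dates are allowed only if there is
# exactly one such record per patient (assumed to be the active
# practice). Two or more undated records are all rejected, because
# we cannot tell which one is current.
# Field names are illustrative, not the real IDB column names.

def check_gp_practice_records(records):
    """Return (accepted, rejected) lists for one patient's GP records."""
    undated = [
        r for r in records
        if not r.get("start_date") and not r.get("end_date")
    ]
    if len(undated) > 1:
        # Ambiguous: more than one undated record, so all of
        # those are rejected.
        return [r for r in records if r not in undated], undated
    return list(records), []

# Usage: two undated records for the same patient are both rejected.
recs = [
    {"practice_code": "A81001", "start_date": None, "end_date": None},
    {"practice_code": "A81002", "start_date": None, "end_date": None},
]
accepted, rejected = check_gp_practice_records(recs)
```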
Next Question - I have some concerns regarding the time that submissions are likely to take. We don't have a full-time data lead or an IT team. Do you have any idea how long this process will take, based on an average number of rejections? I could see that some services had to make 20-plus submissions; it would be good to know how much time to free up in the diary.
It is a good question, really. I guess what we're trying to do is mitigate against all of that. The fact that we ran the private beta and went through all of that work with the pilot organisations has enabled us to home in on exactly what the problems were. As I said, if you're a PCMIS or an IAPTUS user, it's likely that you'll benefit from the problems that they found and fixed in their systems, so you're starting from a better position than the pilot sites. Then, as I say, we've run this webinar, we've developed the guidance document, we are giving an extended window for the August submission and, as we talked about earlier, we're trying to get the test submissions up and running before the 1st of September if we can. We think that putting all of that package together is going to help organisations out, and you won't need to go through quite as many iterations as the pilot sites. In terms of timing, it's quite difficult to put a figure on it. It obviously depends on how many people are submitting data at the same time, but typically we were able to turn the submission itself around quickly. I can't answer for how long it took the providers to build the data in the first place; maybe if somebody wanted to post that information into the chat, feel free to do so. But in terms of the submission itself we were seeing pretty good turnaround times, with people typically receiving the DQ reports in a matter of minutes following the submission. We did some tests halfway through the private beta where we confirmed that every single site received the DQ reports within 15 minutes, for instance.
So yes, the submission process itself is straightforward and did not seem to have much of a time impact. But it will depend on the number of DQ issues, and any work you can do up front to interpret the DQ guidance and absorb what Amol's gone through today will definitely help.
Next Question - you mentioned using the two-factor authentication on work mobile phones, or alternatively the desktop version of the app. Can you use your own mobile phones for this?
Yes, I think Amol has responded and said you can use personal mobile phones. Next Question - Just to clarify, to submit for RVN and RVNCG, do I need separate cloud registrations?
OK, it is a good job you are on the call, Tanya. Because they are two separate organisations, they would have the same SIRO, but to register for both you would need two separate registration DUC forms. Thanks Tanya, you got me out of a hole there.
Next Comment - the first MHSDS submission I did on SDCS Cloud took 5 submissions to achieve 0 rejections; each one took maybe 2 to 3 hours.
OK, that's useful insight there into the local impact of building the submissions. Hopefully that helps.
Thanks for that.
Next Comment - I was also involved in the pilot. We used IAPTUS, and it took me quite some time to extract all the individual tables to create the IDB. The actual submission to the portal is quite quick; getting a successful submission is another kettle of fish.
Yes, I mean, we worked a lot with them and other people in the pilot to work through all the DQ issues, and as I say, that's why we've tried to provide as much guidance and assistance as we can. So I think you just need to be mindful of the fact that it is going to take some time to do this iterative work, and that's why we're keen to get people submitting as early as possible in that submission window, to give themselves as much time as possible to get through it.
Next Comment - yes, resubmission work takes us around 1.5 to 2 days to do.
OK, I think that’s got us to the end of the chat.
So hopefully we have addressed everything.
A few more comments coming through about the length of time it is going to take.
Next Question - is there going to be any direct helpline or team specifically for the first few submissions, to guide us in understanding where items are getting rejected until we are familiar with the DQ reports?
Yes, I mean, we are still working through exactly what level of support we are going to provide, working with NHS England in this regard. But certainly there will be our standard service management approach to registering issues, and specifically DQ issues, and there will be some level of support. I think also, some of you may have had experience with our Data Liaison Team.
We are thinking of drafting that team in, so we may be working in a more proactive way with you as well.
That means looking out for sites that perhaps haven't submitted or are having issues with submission, and then working with those sites directly to resolve those issues. So we're looking at options around that and will send further communications out, but there will definitely be more support available to sites in those first few weeks.
Next Question - Is there any additional funding available? The answer to that, as you'd expect, is no. It depends on how much of an extra workload it is, I suppose, but the hope would be that, with all these mitigations around DQ, the extra workload will not be significantly more than what you have with BSP.
An interesting point raised here: the main hold-up with progressing MHSDS submissions in the first month was the time taken to get a response from the National Service Desk. I think that is right, but I think we've probably moved on quite a lot since that time. Obviously the platform itself was brand new for Mental Health, and we have now had a stable platform for about 12 months. We were also suffering a lot with registrations, which is why we put a lot of effort into the registrations for IAPT, and a lot of those calls were actually to do with password resets. Again, that's new functionality that's been introduced to SDCS Cloud, so it's now possible to reset your own password and you are no longer dependent on the National Service Desk to do that. So I think there are a few things in place that mean we should be in a better place when we Go Live with IAPT compared with the Mental Health Go Live.
OK, again, I think I have got to the end, but I'm going to keep this open for a few minutes anyway.
So, if anybody does have questions, please keep posting. Lisa, can I just ask you to post the standard data processing email address?
Because, as I say, keep posting questions through the chat facility, but if you do think of anything else after the webinar, or you just want to take it offline for whatever reason, please send emails into our data processing email address that Lisa is going to post shortly, and we will pick those up offline.
And there it is - firstname.lastname@example.org.
Next Question - Is that the link that you provided Gavin?
Yes, I have provided the direct link to the document.
OK, as I said, I am going to keep this webinar open for a little while, but I am conscious people are starting to leave, so I will just offer my thanks. Thanks very much to everybody who has joined; I appreciate you taking time out for this webinar today.
I have also just posted the link directly to the change specification there - https://digital.nhs.uk/binaries/content/assets/website-assets/isce/dcb1520/1520142019changespecoctober2019.pdf
Next Question - when will the recording of today's webinar be available?
OK, sorry, my machine seems to be on a go-slow all of a sudden. Yes, as I said before, we need to do some work on the recording, making sure it meets accessibility rules, and on producing a formal transcript for it. We think that will probably take the rest of this week, so we aim to get it published towards the end of this week, maybe early next week.
OK, it does look like the questions are drying up a little bit now, so I think I'm going to draw this to a close. But as I say, if anybody does think of anything else they want to ask, please send the questions through to that email address that Lisa posted earlier. OK, thanks very much everybody, and we will close it there. Goodbye.