WEBVTT

1 00:01:38.550 --> 00:01:42.489 Keela Shatzkin: Hello! There! Hi!
2 00:01:43.980 --> 00:01:46.480 trevorbutterworth: I wonder how many people we're going to get today?
3 00:01:47.450 --> 00:01:50.049 Keela Shatzkin: Oh, are there fun things happening today?
4 00:01:50.830 --> 00:01:51.880 trevorbutterworth: Summer?
5 00:01:55.060 --> 00:02:05.670 trevorbutterworth: Well, on the east coast, the first full day of summer is rainy and cold.
6 00:02:05.740 --> 00:02:10.570 trevorbutterworth: Very good weather for the past 2, 2 and a half weeks, which is exceptional.
7 00:02:10.690 --> 00:02:12.020 Keela Shatzkin: Wow!
8 00:02:12.980 --> 00:02:25.009 trevorbutterworth: Nobody like this. But I also had the, I also was stuck in meetings till 2 am last night. So
9 00:02:25.260 --> 00:02:32.260 Keela Shatzkin: wow! That's a day.
10 00:02:32.320 --> 00:02:35.579 Keela Shatzkin: so
11 00:02:35.630 --> 00:02:53.689 Keela Shatzkin: in a very. It's a small company. We have somebody who's going on maternity leave, so I will be working it. Can you take your work to the beach at least, or, you know, yes, I can do my work as long as I'm on us.
12 00:02:54.260 --> 00:02:58.500 Keela Shatzkin: Governance. So yes, I could.
13 00:02:58.550 --> 00:03:11.180 trevorbutterworth: Okay, cause that's that's what I did. You know, I actually, I used to love June, because, you know, we friends had a place on Fire Island, in Kismet, and it was, it was like 200 a week,
14 00:03:11.430 --> 00:03:17.750 trevorbutterworth: and nobody was out there, so I would just go out there for 3 weeks at a, 3 or 4 weeks in June.
15 00:03:17.980 --> 00:03:23.109 trevorbutterworth: and what, 3 weeks in June. And it was great, it was!
16 00:03:23.310 --> 00:03:30.640 trevorbutterworth: It was just my, it was beautiful and tranquil, and so I'm missing that. But hopefully, fingers crossed, next week
17 00:03:32.140 --> 00:03:53.419 trevorbutterworth: I'll be there for the July weekend. Amazing. And it's like a weekend. Yeah. Well, there's, there's got a, there's an awful lot of logistics required to actually execute in order for that to happen, in terms of the elder care stuff. So yup!
18 00:03:53.800 --> 00:03:59.780 trevorbutterworth: Well, I I am rooting for that to happen. Me, too.
19 00:04:01.700 --> 00:04:06.920 Keela Shatzkin: All right. We can give people another a minute or 2. I don't know, as you said, how much we're getting here,
20 00:04:07.430 --> 00:04:09.320 Keela Shatzkin: but we'll see.
21 00:04:09.820 --> 00:04:17.540 Keela Shatzkin: And if nothing else we can go through the white paper in a very productive small working session.
22 00:04:18.399 --> 00:04:23.500 Keela Shatzkin: Let me just update. I was working on the notes feverishly messaging.
23 00:04:23.900 --> 00:04:34.570 trevorbutterworth: I I've got it. Yeah, I've got a feeling like I I the the the the the new system of connecting to the meeting might be,
24 00:04:34.890 --> 00:04:49.260 trevorbutterworth: like I just, I I was, I got in through Helen's, you know. I looked at Helen on the on the calendar, and so, and then copied from her, just clicked on her meeting invite, because my one seems to be stuck in.
25 00:04:52.550 --> 00:05:03.980 Keela Shatzkin: Yes, agreed, even I had issues getting into the Zoom today. I could not agree with you more. It's a little hard to
26 00:05:04.660 --> 00:05:18.859 Keela Shatzkin: find, like, where is the calendar? Even if you just go to Hyperledger, like, the calendar events is kind of hiding. When you've got a sec, can you share this, my Cardea meeting? Simon is struggling. Hold on 1 sec.
27 00:05:18.890 --> 00:05:24.380 trevorbutterworth: Sure, you want me to put it, I can put it in the Cardea chat. Yes, please.
28 00:05:31.720 --> 00:05:35.160 Keela Shatzkin: And I did it on Slack, so if he's there, then
29 00:05:35.610 --> 00:05:40.370 Keela Shatzkin: he will find us. I know that's not our official tool anymore. But
30 00:06:04.590 --> 00:06:12.290 trevorbutterworth: yeah, I got a feeling last night knocked out a few people on the, in Mountain, on Mountain.
31 00:06:14.570 --> 00:06:16.620 trevorbutterworth: Oh, there we go! We've got Ken.
32 00:06:16.880 --> 00:06:19.930 Ken Ebert: Hello! Good morning. Good morning.
33 00:06:20.100 --> 00:06:21.790 Keela Shatzkin: How are you, sir?
34 00:06:23.820 --> 00:06:31.069 trevorbutterworth: Oh, no, no, that it actually wasn't, yeah, no easier for the, yes, it was Japan.
35 00:06:31.970 --> 00:06:33.779 Keela Shatzkin: Hi, Simon, welcome.
36 00:06:35.040 --> 00:06:36.050 Simon Nazarenko: Hello!
37 00:06:36.380 --> 00:06:37.640 Keela Shatzkin: How are you?
38 00:06:38.990 --> 00:06:49.509 Keela Shatzkin: I'm tired. Yes, I hear. I hear there's a tiredness spreading among the team.
39 00:06:52.440 --> 00:06:53.450 Keela Shatzkin: Okay?
40 00:06:53.540 --> 00:07:02.030 Keela Shatzkin: All right. I think we probably have who we're going to have for today. And we have some
41 00:07:02.600 --> 00:07:06.530 Keela Shatzkin: team editing we're going to do. So I will, I think I can start recording,
42 00:07:07.090 --> 00:07:09.310 Keela Shatzkin: unless it's already recording. Hard to know.
43 00:07:11.720 --> 00:07:18.129 Keela Shatzkin: I think we might already be recording, so excellent. We can jump in to
44 00:07:19.050 --> 00:07:23.169 Keela Shatzkin: today's housekeeping. Let me share my screen.
45 00:07:26.000 --> 00:07:44.169 Keela Shatzkin: All right. So as we have discussed, we've moved all of our housekeeping things over to the Hyperledger official toolkit. So here we are. These are the meeting notes for today's meeting. Welcome to the June 22nd Cardea meeting.
46 00:07:44.200 --> 00:08:00.490 Keela Shatzkin: We have on today's agenda to discuss the Cardea white paper and make appropriate edits and updating to that, now that we're a part of the Hyperledger family and have been working on sort of repositioning Cardea in this space.
47 00:08:01.170 --> 00:08:08.470 Keela Shatzkin: We also need to just briefly cover the antitrust policy, which is Hyperledger's antitrust policy. We're not talking about explicit
48 00:08:08.550 --> 00:08:11.450 Keela Shatzkin: business opportunities here.
49 00:08:12.030 --> 00:08:16.830 Keela Shatzkin: Speaking more generally, if anybody has concerns, please let Ken or myself know.
50 00:08:17.430 --> 00:08:36.329 Keela Shatzkin: And additionally, our code of conduct is to encourage participation from all people, all participants, and that this is a safe space for ideas and discussion. If anybody has concerns with that, again, you can reach out to Ken or myself, or through Hyperledger directly.
51 00:08:37.100 --> 00:08:44.729 Keela Shatzkin: So, as I mentioned, today's agenda: we've had a series of really wonderful
52 00:08:45.080 --> 00:08:53.840 Keela Shatzkin: guest speakers, and we hope to have more come and enlighten us about the work they are doing in this space, and how it may
53 00:08:54.110 --> 00:09:18.509 Keela Shatzkin: align with the work that we've started here at Cardea. But today's session is really to revive that white paper that we wrote, at this point, I don't know, a year or 2 ago. It's been a while. It feels like 2 years, but it was a year ago, a year. Okay, well, it's ripe for review. And so we're going to dive into that, and then outline our sort of coming goals as we enter the summer period.
54 00:09:18.680 --> 00:09:22.860 And what we want to do with all this wonderful information that we've been gathering.
55 00:09:23.490 --> 00:09:34.230 Keela Shatzkin: Is there anybody that would like to introduce themselves today? I think we have a small crew and mostly familiar. So if you want to say hi, please do. Otherwise, well,
56 00:09:34.470 --> 00:09:36.309 Keela Shatzkin: we can jump into it, I think.
57 00:09:36.930 --> 00:09:40.390 trevorbutterworth: Okay, let me share my screen.
58 00:09:42.100 --> 00:09:46.859 trevorbutterworth: So I'm sharing it in Acrobat because it's going to look a lot easier
59 00:09:46.910 --> 00:09:52.699 trevorbutterworth: on the eyes. The problem with Google Docs is it automatically sizes to
60 00:09:52.820 --> 00:10:04.340 trevorbutterworth: considerably less than the real size. And then people struggle to read, and then they say the font's all wrong, and why'd you make it so small. And it's because Google has rendered it at 56%.
61 00:10:05.060 --> 00:10:13.680 trevorbutterworth: So we, this is not something that can be edited, necessarily. Well, I can, but I wouldn't advise editing directly. I will take notes.
62 00:10:14.060 --> 00:10:42.919 trevorbutterworth: So, broadly, the the goal of this is to brand as Hyperledger. Well, we needed to brand as Hyperledger, but also to, you know, a lot has happened in the last year, and the the text needed a refresh, additional elements needed to be put in. And so here it goes. So one difference to the past white paper is the credits have been moved to the front.
63 00:10:43.250 --> 00:10:44.270 Keela Shatzkin: Hmm.
64 00:10:44.530 --> 00:10:54.510 trevorbutterworth: And we've also, credit, credit to you, we've tried to do a bit of a better job on crediting, so
65 00:10:58.200 --> 00:10:59.390 Keela Shatzkin: excellent?
66 00:11:02.580 --> 00:11:05.639 trevorbutterworth: And and does this include
67 00:11:05.870 --> 00:11:25.489 Keela Shatzkin: in the credits, do we need to include? I guess it's more history, though, the life cycle, that we went from Linux Foundation, yes, well, Linux Foundation Public Health, for its encouraging initial support of the Cardea project, and in particular, Jim St. Clair and Jenny Wanger. Is there anything, do we need to say more than that?
68 00:11:26.220 --> 00:11:30.920 Ken Ebert: I think that adequately covers the past. And then
69 00:11:30.930 --> 00:11:37.039 Ken Ebert: the next paragraph covers the Hyperledger Foundation for pulling us into that.
70 00:11:39.080 --> 00:11:40.240 Keela Shatzkin: excellent
71 00:11:41.770 --> 00:11:46.330 trevorbutterworth: okay, moving on to page 3. So this is an overview.
72 00:11:50.530 --> 00:12:05.970 trevorbutterworth: It's not, I don't think it's, it's not significantly, I mean, there's some tweaks, but it's not significantly changed versus the previous overview. Again, it notes the move to Hyperledger Labs.
73 00:12:25.210 --> 00:12:31.430 Keela Shatzkin: This calls it machine readable governance rules, at the very last paragraph, that's where I see that.
74 00:12:31.450 --> 00:12:33.359 Keela Shatzkin: Do we need to update that
75 00:12:34.320 --> 00:12:37.740 Ken Ebert: to decentralized ecosystem governance?
76 00:12:38.270 --> 00:12:43.750 Keela Shatzkin: I mean, if you're not in this, maybe machine readable governance means something, but
77 00:12:44.110 --> 00:12:47.620 Keela Shatzkin: we've, like, branded it now.
78 00:12:48.960 --> 00:12:55.769 trevorbutterworth: So I I'm I'm not sure I'm remembering correctly some other thing. But
79 00:12:56.350 --> 00:13:00.420 trevorbutterworth: I, you know, I I, until
80 00:13:00.470 --> 00:13:07.870 trevorbutterworth: the specification is formally, I think, adopted by the DIF, then decentralized ecosystem governance
81 00:13:08.150 --> 00:13:13.780 trevorbutterworth: is, is in some kind of limbo as a term.
82 00:13:13.900 --> 00:13:26.250 Keela Shatzkin: Okay? And I think this is a lower case machine readable governance, right? It's like the fact that it's being done by the computer, neutrally, potentially. I think that's fine. I just wanted to call it out.
83 00:13:27.020 --> 00:13:37.909 Ken Ebert: We're about a week away from ratification of this, of the first draft of the, oh, really, the first version of the machine readable governance. So I think we just need to
84 00:13:38.710 --> 00:13:44.150 Ken Ebert: take that into consideration, if you want to line up with that or not.
85 00:13:44.750 --> 00:13:52.739 trevorbutterworth: Maybe we need to, maybe we leave it in this page, and then, I don't know if we talk about machine readable governance anywhere else, we may want to transition
86 00:13:53.060 --> 00:13:54.770 Keela Shatzkin: and explain that,
87 00:13:54.880 --> 00:14:07.059 Keela Shatzkin: but that it's now under, you know, that it's under the DIF and things like that. There's there's a lot. So so one of the things, there's a, there's the section on machine readable governance, slash, DEGov, has been updated.
88 00:14:07.380 --> 00:14:09.290 Keela Shatzkin: So we get to that
89 00:14:09.670 --> 00:14:13.550 trevorbutterworth: imminently. Anything else on page 3, that
90 00:14:21.520 --> 00:14:24.360 trevorbutterworth: okay, well, that's just the
91 00:14:24.860 --> 00:14:26.060 trevorbutterworth: contents.
92 00:14:28.310 --> 00:14:34.719 trevorbutterworth: Mission and background. The text here has been tightened up over the previous version.
93 00:15:00.430 --> 00:15:06.319 Keela Shatzkin: I think it looks okay. Can we pause for a second and just also establish the goals for reviewing this?
94 00:15:08.240 --> 00:15:09.909 Keela Shatzkin: We want to sort of
95 00:15:12.230 --> 00:15:14.329 Keela Shatzkin: shift to Hyperledger,
96 00:15:14.390 --> 00:15:20.200 Keela Shatzkin: because it may be received by an audience now for the first time, as a larger audience, and
97 00:15:22.860 --> 00:15:26.410 Keela Shatzkin: get some excitement going. Those are the 2 goals.
98 00:15:26.870 --> 00:15:29.840 trevorbutterworth: Well, also bring it, you know,
99 00:15:30.940 --> 00:15:51.959 trevorbutterworth: bring it a good, you know, renovate in light of the previous year, and with fresh eyes on design and communication. So new graphics, etc., etc., which build on changes that have taken place over the years. So that it's, it's a fresh coat of paint,
100 00:15:52.590 --> 00:15:57.170 trevorbutterworth: with some additional, with with an additional extension, couple of extensions.
101 00:16:01.340 --> 00:16:15.130 Ken Ebert: In the the second to last paragraph on that slide, it talks about "to adapt and function for sharing and verifying multiple kinds of health data." I think, put in something there, including consent.
102 00:16:15.730 --> 00:16:16.840 trevorbutterworth: Okay?
103 00:16:18.600 --> 00:16:24.220 Ken Ebert: Otherwise people might assume that it's just clinical data.
104 00:16:25.420 --> 00:16:29.090 Ken Ebert: Keela, what's your thought as a medical professional? What do you think?
105 00:16:30.990 --> 00:16:33.410 Keela Shatzkin: I think that's fine? I think that makes sense,
106 00:16:37.370 --> 00:16:39.779 Keela Shatzkin: and I don't know if we want to
107 00:16:41.380 --> 00:16:50.259 Keela Shatzkin: add more than that, because we haven't delved into those use cases extensively yet. But I do think the consent conundrum is a well known
108 00:16:50.650 --> 00:16:51.490 Keela Shatzkin: problem.
109 00:16:52.130 --> 00:16:54.070 Ken Ebert: Well, yeah.
110 00:16:54.200 --> 00:16:58.069 trevorbutterworth: And I mean, I I suspect when we have
111 00:16:58.110 --> 00:17:07.090 trevorbutterworth: worked out how to deal with consent, we will need to do a version 3 of this with an additional, with a section on that.
112 00:17:07.910 --> 00:17:09.880 trevorbutterworth: But I suspect
113 00:17:09.930 --> 00:17:16.590 trevorbutterworth: we don't have a huge amount to say about how that is technically soluble at this point.
114 00:17:17.210 --> 00:17:22.739 trevorbutterworth: Well, I can't. Well, you, that's that. That was a question more so than a statement.
115 00:17:28.130 --> 00:17:35.909 Keela Shatzkin: I think adding the mention is fine, but you're right that more elaboration will be needed when we fully flesh out a consent use, like a
116 00:17:36.360 --> 00:17:38.760 Keela Shatzkin: consent capture and release use case.
117 00:17:42.370 --> 00:17:44.740 trevorbutterworth: So this has been expanded a little bit.
118 00:17:58.730 --> 00:18:00.129 Ken Ebert: That's good to me.
119 00:18:00.630 --> 00:18:01.730 Keela Shatzkin: Yep, agreed.
120 00:18:01.850 --> 00:18:04.919 trevorbutterworth: I'm sorry, I'm taking a couple of minor notes.
121 00:18:08.380 --> 00:18:09.659 trevorbutterworth: no.
122 00:18:11.080 --> 00:18:12.470 trevorbutterworth: So this is new.
123 00:18:13.920 --> 00:18:27.220 trevorbutterworth: Again, it's, it, it's, it sort of illuminates a section that we had. So this is an addition.
124 00:18:51.290 --> 00:18:53.899 Keela Shatzkin: And the message this is delivering
125 00:18:55.980 --> 00:19:00.109 Ken Ebert: the thing is that for medical,
126 00:19:00.600 --> 00:19:06.239 Ken Ebert: most of the data will be personal data regardless of its source. So this one is less,
127 00:19:06.660 --> 00:19:16.540 Ken Ebert: there are, there are some organizational credentials that are indicated,
128 00:19:16.850 --> 00:19:26.640 Ken Ebert: but the the type of data that we're dealing with generally pushes towards making the personal data really large, and the the other data really small.
129 00:19:27.010 --> 00:19:29.499 trevorbutterworth: So access, graphic.
130 00:19:30.410 --> 00:19:41.350 Keela Shatzkin: Well, I'm still confused about exactly what the message is that this is trying to convey. Is it saying that verifiable data could be in any one of those little dots?
131 00:19:41.860 --> 00:19:44.809 Ken Ebert: Yeah. Verifiable data point? Yes.
132 00:19:46.170 --> 00:19:48.830 Ken Ebert: And and the the left half of the
133 00:19:49.110 --> 00:19:56.890 Ken Ebert: of the diagram, up to people, organizations, machine sensors, connected data, makes much more sense
134 00:19:57.350 --> 00:20:04.660 trevorbutterworth: for Cardea than the right half of the data.
135 00:20:06.360 --> 00:20:13.669 Ken Ebert: Yes, the the the part at the bottom, the interoperable, preserving, and all that, that makes sense too.
136 00:20:13.800 --> 00:20:22.899 Ken Ebert: And you could just get rid of the bubbles at the top side and move the interoperable and other stuff as attributes of the data,
137 00:20:24.410 --> 00:20:27.529 Ken Ebert: or descriptions or qualities of the data.
138 00:20:27.580 --> 00:20:31.210 Keela Shatzkin: And "one ledger," right, equals, I'm not sure.
139 00:20:32.120 --> 00:20:44.510 Ken Ebert: Unlimited digital identities, holding the data off chain, and verifiable credentials. So you contrast how much has to go on the ledger. In some ledger systems, all the data goes on the ledger,
140 00:20:44.850 --> 00:20:52.210 Ken Ebert: and in Cardea the keys go on the ledger, and all the data goes to to the off chain wallet.
141 00:20:52.940 --> 00:20:54.929 Keela Shatzkin: But that isn't clear.
142 00:20:55.270 --> 00:21:02.340 Ken Ebert: Okay, for those of us looking at the slide for the first time, I would say, I missed that memo.
143 00:21:02.460 --> 00:21:17.809 Ken Ebert: Okay, that that's real good feedback, because that that helps. I think you need, we need to show the dots off the ledger, if that's the point, right. Put the personal PHI dots on this page,
144 00:21:18.030 --> 00:21:19.800 Ken Ebert: put the ledger on the,
145 00:21:19.930 --> 00:21:43.629 Ken Ebert: on the left underneath the "one ledger," right, and then say all the rest of the data is off ledger. Put one dot going to the ledger, and all the rest going into the wallet. Is that, that kind of captures, what? Yeah, exactly. We're trying to say that you do less on the ledger when we're talking about this, because it's decentralized, and it's held by the holder. But all these dots are sitting here on the ledger. I mean, it's just, so the
146 00:21:44.440 --> 00:21:57.350 Ken Ebert: I don't get the point of the "one ledger," right, equals all these dots.
147 00:21:57.580 --> 00:22:04.430 Keela Shatzkin: Right? It's pretty. I like the idea of the slide, but it didn't deliver the intended message.
148 00:22:10.420 --> 00:22:14.449 Keela Shatzkin: Sorry. That's okay. That's what we're here for?
149 00:22:15.460 --> 00:22:16.120 Ken Ebert: Yeah.
150 00:22:19.380 --> 00:22:21.510 trevorbutterworth: So this is,
151 00:22:21.720 --> 00:22:25.520 trevorbutterworth: again, slightly changed, mostly the same, but
152 00:22:25.660 --> 00:22:27.180 trevorbutterworth: more tightly edited.
153 00:22:50.650 --> 00:22:53.899 Keela Shatzkin: Oh, this is fun history. I'm learning new stuff.
154 00:22:55.830 --> 00:23:02.629 trevorbutterworth: So you, do we, are you telling me you didn't read the first one? I don't think ARPANET was in the first one.
155 00:23:57.420 --> 00:24:07.800 Keela Shatzkin: Do, is there any value, in that second column it's talking about email as a central identity, is there any value in mentioning
156 00:24:08.610 --> 00:24:17.339 Keela Shatzkin: the security there as well? That, like, that email provider in theory has access to all of their stuff,
157 00:24:23.370 --> 00:24:25.240 Keela Shatzkin: because it's centralized
158 00:24:26.320 --> 00:24:29.270 Keela Shatzkin: and housed by that third party.
159 00:24:34.440 --> 00:24:36.619 trevorbutterworth: I'll let others adjudicate on that.
160 00:24:39.580 --> 00:24:46.059 Ken Ebert: yeah. The fact that it is stored in a database by the provider makes it subject to
161 00:24:46.710 --> 00:24:55.840 Ken Ebert: attack. And
162 00:24:55.900 --> 00:25:01.180 Ken Ebert: it provides the access to the email provider to
163 00:25:01.480 --> 00:25:08.060 Ken Ebert: observe all your data, and also makes it a target for hackers.
164 00:25:10.300 --> 00:25:12.920 trevorbutterworth: Okay. How about?
165 00:25:41.810 --> 00:25:52.909 Ken Ebert: Well, you in the second paragraph already make the database an attractive target for crime. But Keela's other point is that the data is also available to email providers, a unique statement.
166 00:25:53.570 --> 00:25:54.710 Keela Shatzkin: right?
167 00:26:30.740 --> 00:26:33.109 Keela Shatzkin: Do we define federated identity?
168 00:26:37.280 --> 00:26:46.540 Keela Shatzkin: "An authenticated login to to a user profile on one platform enables," do we need to do any more explanation of that? Or do we think our readers will connect those dots?
169 00:26:52.230 --> 00:26:57.379 trevorbutterworth: Should I just say, I, I mean, should I say, such as Google or Facebook?
170 00:26:57.550 --> 00:27:08.689 Ken Ebert: yeah, Google email, using your Google email to log into a different thing, to something totally unrelated. I don't know how to phrase that better. But using your Google email to log into other services.
171 00:27:09.410 --> 00:27:10.090 Keela Shatzkin: Yeah.
172 00:27:13.770 --> 00:27:18.120 Keela Shatzkin: just so that there's no room for gaps in interpretation there.
173 00:27:24.420 --> 00:27:26.550 trevorbutterworth: Okay, moving on.
174 00:28:33.940 --> 00:28:34.870 trevorbutterworth: move on.
175 00:28:35.380 --> 00:30:03.709 Keela Shatzkin: I think we, in the first
176 00:30:04.160 --> 00:30:08.980 Keela Shatzkin: column of comments, I think we need to
177 00:30:09.880 --> 00:30:11.870 Keela Shatzkin: be clear,
178 00:30:13.630 --> 00:30:23.889 Keela Shatzkin: because we still have an, if there's still an issuing, right, there's an issuer of the credential, why is that different than the federated one? And I think
179 00:30:24.360 --> 00:30:30.470 Keela Shatzkin: adding a sentence or 2 to highlight that it's, like, issued, and now in the ownership of the
180 00:30:30.690 --> 00:30:31.720 Keela Shatzkin: holder,
181 00:30:32.500 --> 00:30:38.639 Keela Shatzkin: to then reuse in a in a disconnected way, except to verify that trust. I don't know that that's
182 00:30:38.870 --> 00:30:40.640 Keela Shatzkin: explicitly clear.
183 00:30:49.220 --> 00:30:55.080 Keela Shatzkin: This is kind of good to do, but with a, after a long time, fresh set of eyes.
184 00:31:10.390 --> 00:31:13.980 Keela Shatzkin: Maybe it gets into it in the second thing, I wasn't there yet.
185 00:31:16.280 --> 00:31:17.440 Keela Shatzkin: Second column.
186 00:31:26.090 --> 00:31:34.980 Ken Ebert: I think, in the second paragraph, where we said "any governing authority can issue a digital credential," some,
187 00:31:35.230 --> 00:31:43.360 Ken Ebert: "to the, to the holder," or something like that, "to," something like that, just "to the holder,"
188 00:31:43.960 --> 00:31:46.710 makes it clear that the data went to them.
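A minimal sketch of the issue, hold, verify pattern being discussed here: the issuer signs a credential, the holder keeps it in their own wallet, and a verifier later checks the signature against the issuer's public key without contacting the issuer. This is an illustration only, not Cardea's actual Aries/Indy implementation; the field names and the toy wallet are invented for the example.

```python
# Sketch of issue -> hold -> verify (illustrative only; not Cardea's actual stack).
# Uses the third-party "cryptography" package for Ed25519 signatures.
import json
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# Issuer (e.g., a public health authority) holds a signing key pair.
issuer_key = Ed25519PrivateKey.generate()
issuer_public_key = issuer_key.public_key()  # published so verifiers can trust it

# 1. Issue: sign a credential and hand the signed payload to the holder.
credential = {"type": "TestResultCredential", "holder": "did:example:patient123",
              "result": "negative", "collected": "2023-06-20T09:00:00Z"}
payload = json.dumps(credential, sort_keys=True).encode()
signature = issuer_key.sign(payload)

# 2. Hold: the holder keeps credential + signature in their own wallet (off-ledger).
wallet = {"credential": credential, "signature": signature}

# 3. Verify: a verifier checks the signature using only the issuer's public key.
def verify(entry, public_key) -> bool:
    data = json.dumps(entry["credential"], sort_keys=True).encode()
    try:
        public_key.verify(entry["signature"], data)
        return True
    except InvalidSignature:
        return False

print(verify(wallet, issuer_public_key))  # True: issued by the trusted issuer, unmodified
```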
189 00:31:59.720 --> 00:32:01.160 trevorbutterworth: Okay.
190 00:32:02.780 --> 00:32:08.759 Keela Shatzkin: yeah, because it does get into it a little bit more in the second column. But I think that transfer of,
191 00:32:09.510 --> 00:32:13.180 Keela Shatzkin: the hand-off, isn't, it wasn't clear to me in that section over there.
192 00:32:21.070 --> 00:32:28.169 Ken Ebert: It's reiterated in that "we know, we can know that it was issued to the traveler," in the top right paragraph.
193 00:32:29.940 --> 00:32:38.100 Ken Ebert: That kind of goes over it again. But I think a hint of it in the second paragraph. Yep, okay.
194 00:32:38.470 --> 00:32:40.350 Keela Shatzkin: thanks, Trevor.
195 00:33:25.110 --> 00:33:30.160 Ken Ebert: W3C, DIDComm, don't go in the same parentheses.
196 00:33:31.950 --> 00:33:34.929 Ken Ebert: Take the W3C out of there.
197 00:33:35.190 --> 00:33:37.589 trevorbutterworth: Yeah, I'm not sure why that was there.
198 00:33:37.920 --> 00:33:39.659 Ken Ebert: The DIDComm is fine.
199 00:33:51.890 --> 00:33:53.480 Ken Ebert: I'm good with this page.
200 00:33:53.510 --> 00:33:54.640 trevorbutterworth: Okay.
201 00:35:12.080 --> 00:35:13.460 Ken Ebert: I like this page.
202 00:35:14.270 --> 00:35:16.630 Keela Shatzkin: Sorry, what's the digital twins?
203 00:35:20.760 --> 00:35:25.559 Ken Ebert: Smart cities and robots. Those are the digital twins.
204 00:35:26.000 --> 00:35:33.689 trevorbutterworth: Digital, digital twins are are sort of a replication of infrastructure in digital form.
205 00:35:33.810 --> 00:35:45.739 Ken Ebert: So if you have something that's not very intelligent, it doesn't have any any capabilities, it's an inanimate object, you can create a digital twin to represent it in the digital world.
206 00:35:46.220 --> 00:35:47.379 Ken Ebert: So if you have
207 00:35:49.830 --> 00:36:01.610 Ken Ebert: a box of goods, it can, ha, have a digital identity assigned to it, and its location or movement, even though it's not capable of itself interacting digitally.
208 00:36:01.850 --> 00:36:03.030 Keela Shatzkin: got it
209 00:36:06.050 --> 00:36:13.360 trevorbutterworth: very big in industrial uses, is all. It's a growth area in industrial uses,
210 00:36:13.580 --> 00:36:25.960 trevorbutterworth: product lines and things like that.
211 00:36:45.290 --> 00:36:55.869 Keela Shatzkin: I think we have a, we're missing a word. Sorry, I jumped ahead a little bit. After "agents," "because they manage information flow between parties have a fiduciary," I think it's "and have,"
212 00:36:56.300 --> 00:36:57.560 Keela Shatzkin: or "that have."
213 00:36:58.800 --> 00:37:01.260 Keela Shatzkin: "Parties" and "have," it's missing a word in the middle.
214 00:37:02.170 --> 00:37:04.879 Keela Shatzkin: There in the agents paragraph.
215 00:37:09.790 --> 00:37:11.609 trevorbutterworth: yes, "and have."
216 00:37:33.070 --> 00:37:36.180 trevorbutterworth: okay, I'm moving on. Anything else on this page?
217 00:37:42.610 --> 00:37:45.800 Ken Ebert: Mobile agents enable connections with mobile devices.
218 00:37:52.090 --> 00:37:54.709 trevorbutterworth: Mediate. That should be mediator agents, shouldn't it?
219 00:37:54.940 --> 00:37:56.520 Ken Ebert: yes.
220 00:38:04.820 --> 00:38:06.090 Ken Ebert: yeah, yeah.
221 00:38:06.890 --> 00:38:07.900 trevorbutterworth: Got you.
222 00:38:09.330 --> 00:38:10.460 Ken Ebert: I'm good with it.
223 00:38:28.300 --> 00:38:30.200 Keela Shatzkin: Not a triangle anymore?
224 00:39:24.500 --> 00:39:26.479 Keela Shatzkin: Oh, okay.
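A rough sketch of the digital twin idea Ken describes above: an object that cannot interact digitally itself (a box of goods) gets a digital identity, and events about it, such as location or movement, are recorded against that identity. Purely illustrative; the class and field names are invented, not taken from the white paper.

```python
# Illustrative sketch of a "digital twin": a passive physical object represented
# by a digital identity that accumulates events on its behalf. Names are invented.
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class DigitalTwin:
    did: str                      # identifier assigned to the physical object
    description: str
    events: list = field(default_factory=list)

    def record(self, kind: str, detail: str) -> None:
        # The box of goods cannot report anything itself; sensors or handlers
        # record events against its twin.
        self.events.append({"at": datetime.now(timezone.utc).isoformat(),
                            "kind": kind, "detail": detail})

box = DigitalTwin(did="did:example:crate-0042", description="Crate of vaccine doses")
box.record("location", "Left warehouse A")
box.record("temperature", "2.8 C")
print(box.events)
```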
225 00:39:43.240 --> 00:39:50.300 Do we need to put the labels? We defined the participants in this. Do we need to use that language?
226 00:39:50.920 --> 00:39:53.339 Keela Shatzkin: Who's the issuer, the holder,
227 00:39:54.080 --> 00:39:55.360 Keela Shatzkin: the verifier?
228 00:39:57.440 --> 00:39:59.370 Keela Shatzkin: Those labels aren't here.
229 00:40:05.170 --> 00:40:06.150 trevorbutterworth: Okay.
230 00:40:09.690 --> 00:40:15.079 Ken Ebert: They're partly there. So you have issuer agent and a verifier agent. You just don't have,
231 00:40:15.270 --> 00:40:23.940 Ken Ebert: when you, where you say at the top "cloud or mobile," "holder agent," that would kind of
232 00:40:25.100 --> 00:40:29.190 trevorbutterworth: I can put, I I can we,
233 00:40:29.620 --> 00:40:32.300 trevorbutterworth: I can rework that to make that more explicit.
234 00:40:33.330 --> 00:40:40.760 Ken Ebert: I think it's, it's, if you had the holder agent in the "cloud or mobile holder agent," then I think you'd have all 3.
235 00:40:41.140 --> 00:40:54.159 Keela Shatzkin: Well, I was going to put holder under patient, issuer under public health authority, and then verifier is over here. Yeah, that would make it. I think you can leave the agents, because I think that's helpful, too. But
236 00:40:54.860 --> 00:40:57.119 Keela Shatzkin: having that clear would would also be good.
237 00:41:07.090 --> 00:41:10.789 Keela Shatzkin: So this is the one that's going to need some re-editing in a week or 2.
238 00:41:14.540 --> 00:41:17.829 trevorbutterworth: As the second. This is those 2 pages here on the governance.
239 00:42:06.780 --> 00:42:09.370 Keela Shatzkin: Should we change,
240 00:42:09.480 --> 00:42:22.030 Keela Shatzkin: in the, fourth paragraph, the benefit of machine readable governance, instead of "as the science," should we change it to "as the requirements or regulations change," just to make it a little bit more neutral?
241 00:42:34.820 --> 00:42:42.940 Ken Ebert: So one of them is to indicate that, as the data from the research says,
242 00:42:43.130 --> 00:42:52.440 Ken Ebert: it needs to be 24 h versus 48 h, that that's that part of it. The regulations are the ones that say you have to have a test or
243 00:42:52.470 --> 00:42:59.009 Ken Ebert: vaccination, or whatever. And so the 2 kind of play in tandem. So how do you express
244 00:42:59.060 --> 00:43:01.640 Ken Ebert: that part of the, the sciencey part of it,
245 00:43:01.770 --> 00:43:04.639 Ken Ebert: in another way? If you wanted to use another way.
246 00:43:04.910 --> 00:43:08.020 trevorbutterworth: I, I was thinking scientific evidence for requirements.
247 00:43:10.350 --> 00:43:11.929 Ken Ebert: Yeah, that would do, too.
248 00:43:16.090 --> 00:43:17.230 trevorbutterworth: Okay.
249 00:43:24.980 --> 00:43:28.320 trevorbutterworth: now the, may, yes. So there,
250 00:43:29.960 --> 00:43:40.120 trevorbutterworth: right. So I see, see a problem. So I think we have to introduce the concept of decentralized ecosystem governance
251 00:43:40.220 --> 00:43:42.909 trevorbutterworth: more explicitly in the final paragraph
252 00:43:46.130 --> 00:43:51.360 trevorbutterworth: for this, for this next to make sense.
253 00:43:52.270 --> 00:43:53.210 Ken Ebert: agreed.
254 00:43:56.230 --> 00:44:19.650 Keela Shatzkin: And maybe it's just, you know, as this concept has become more adopted, it's been rebranded. I I still think there's some value in referencing it as machine readable governance, and so getting rid of that entirely, I don't know that we need to do that. But we should probably explain it and then introduce its other name in this universe.
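As a rough illustration of the machine-readable governance idea being discussed, publish the rules as data that software applies, and update the file as requirements or regulations change (for example, 48 hours tightening to 24), here is a toy governance file and check. This is an assumption-laden sketch, not the DIF specification format or Cardea's published governance files; all field names are invented.

```python
# Toy sketch of machine-readable governance: rules published as data, applied by code.
# Not the DIF / decentralized ecosystem governance spec format; fields are invented.
import json
from datetime import datetime, timedelta, timezone

governance = json.loads("""
{
  "trusted_issuers": ["did:example:public-health-authority"],
  "accepted_credentials": ["TestResultCredential", "VaccinationCredential"],
  "max_test_age_hours": 48
}
""")

def presentation_ok(issuer_did, cred_type, collected_at, rules) -> bool:
    if issuer_did not in rules["trusted_issuers"]:
        return False
    if cred_type not in rules["accepted_credentials"]:
        return False
    age = datetime.now(timezone.utc) - collected_at
    return age <= timedelta(hours=rules["max_test_age_hours"])

# If regulations tighten from 48 to 24 hours, only the published file changes;
# verifier software picks up the new rule without being rewritten.
collected = datetime.now(timezone.utc) - timedelta(hours=30)
print(presentation_ok("did:example:public-health-authority",
                      "TestResultCredential", collected, governance))
```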
255 00:44:26.970 --> 00:44:34.359 Keela Shatzkin: Number one, should that be "encoded," "publishes in code"? That's, that's, yeah, I see the problem there.
256 00:44:35.410 --> 00:44:37.240 trevorbutterworth: I think just "publishes."
257 00:44:37.820 --> 00:44:38.520 Keela Shatzkin: Okay.
258 00:44:50.150 --> 00:44:51.760 Ken Ebert: yes, that's,
259 00:44:51.910 --> 00:44:53.700 Ken Ebert: sounds good to me.
260 00:45:02.970 --> 00:45:03.930 trevorbutterworth: Move on.
261 00:45:04.220 --> 00:45:04.970 Ken Ebert: Yeah.
262 00:46:14.450 --> 00:46:19.260 Keela Shatzkin: I find that second paragraph a little stumbling.
263 00:46:20.020 --> 00:46:24.180 trevorbutterworth: The second one.
264 00:46:26.720 --> 00:46:31.630 Keela Shatzkin: It seems, it's like a little circular. Perhaps it could be simplified,
265 00:46:31.760 --> 00:46:36.639 Keela Shatzkin: like, "is a URL or web address," maybe if that goes in parentheses.
266 00:46:39.590 --> 00:46:45.120 Keela Shatzkin: I'm not, I think it's trying to say that
267 00:46:45.190 --> 00:46:53.010 Keela Shatzkin: for something with an IP address, basically, it's gonna have a unique, each one will have a unique DID.
268 00:46:57.760 --> 00:47:03.479 Ken Ebert: No, I think it's a compare and contrast kind of thing. It's the, if a device has an IP address,
269 00:47:04.160 --> 00:47:07.790 Ken Ebert: and the digital identity begins with a DID.
270 00:47:09.930 --> 00:47:11.780 Keela Shatzkin: Okay.
271 00:47:11.930 --> 00:47:19.999 Keela Shatzkin: okay, I'll rework that. If there's, yeah, it just needs a little wordsmithing to simplify it.
272 00:47:20.120 --> 00:47:24.979 Ken Ebert: "Approved by the World Wide Web," it's actually "recommended."
273 00:47:25.010 --> 00:47:26.170 trevorbutterworth: Okay.
274 00:47:26.370 --> 00:47:28.470 Ken Ebert: That's their official language.
275 00:47:29.510 --> 00:47:34.150 Ken Ebert: Kind of a nuance, but it will make the World Wide Web people happier.
276 00:47:43.790 --> 00:47:49.390 Steve Davis (Shatzkin Systems): I think there's like 4 synonyms there, in that same paragraph, to your point, that you have.
277 00:47:49.460 --> 00:47:52.909 Steve Davis (Shatzkin Systems): You could just, you know, "website" and "web address"
278 00:47:52.960 --> 00:48:03.089 Steve Davis (Shatzkin Systems): is a little bit extra redundant. But you could just say "IP address," comma, "URL," comma, or "web address." Those are
279 00:48:03.760 --> 00:48:04.910 Steve Davis (Shatzkin Systems): synonyms.
280 00:48:12.350 --> 00:48:26.000 Keela Shatzkin: You know, the web address is the synonym for a URL, is it not? I think the IP address is of a different layer.
281 00:48:26.250 --> 00:48:33.060 Ken Ebert: But if you just said "IP address" and a, or a "URL,"
282 00:48:33.390 --> 00:48:34.430 Ken Ebert: you can
283 00:48:35.450 --> 00:48:38.499 Ken Ebert: simplify the complexity a little bit.
284 00:48:38.570 --> 00:48:52.520 Keela Shatzkin: Yeah, maybe also not "if," because it's missing the, like, second part of the "if that thing, then the next thing." I've missed the "then the next thing" part. I don't know, there's something about that paragraph. I've moved on.
285 00:49:00.450 --> 00:49:03.100 trevorbutterworth: Have you? Are we still on this page?
286 00:49:04.480 --> 00:49:07.330 trevorbutterworth: Yeah, I guess we are.
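To give the IP address and URL comparison above something concrete: a DID is an identifier of the form did:method:identifier, and resolving it yields a DID document (keys, service endpoints) rather than a web page. A small sketch follows; the in-memory registry and the example key value are stand-ins, since real resolution goes through a DID method such as a ledger or a web server.

```python
# Sketch: a DID ("did:<method>:<identifier>") resolves to a DID document
# (keys, service endpoints), not to a page the way a URL does.
# The in-memory "registry" below is a stand-in for a real DID method.

def parse_did(did: str):
    scheme, method, identifier = did.split(":", 2)
    assert scheme == "did"
    return method, identifier

registry = {
    "did:example:clinic-42": {
        "id": "did:example:clinic-42",
        "verificationMethod": [{"id": "#key-1", "type": "Ed25519VerificationKey2020",
                                "publicKeyMultibase": "z6Mk-example-value"}],
        "service": [{"type": "DIDCommMessaging",
                     "serviceEndpoint": "https://agent.example.org"}],
    }
}

def resolve(did: str) -> dict:
    # A real resolver would consult the DID method (ledger, web, peer-to-peer, ...).
    return registry[did]

method, ident = parse_did("did:example:clinic-42")
print(method, ident)
print(resolve("did:example:clinic-42")["service"][0]["serviceEndpoint"])
```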
287 00:49:16.000 --> 00:49:28.699 Ken Ebert: The mobile agents, instead of "can or create unique," third from the bottom on the right, it should say "can create." They don't have to. They,
288 00:49:31.510 --> 00:49:33.229 Ken Ebert: they probably should, but
289 00:49:35.900 --> 00:49:37.679 Ken Ebert: they're not obligated to.
290 00:49:39.790 --> 00:49:40.820 trevorbutterworth: Gotcha.
291 00:49:51.200 --> 00:49:53.099 Ken Ebert: Nice, nice writing, though.
292 00:50:01.150 --> 00:50:02.850 trevorbutterworth: So I borrowed this from
293 00:50:03.890 --> 00:50:05.930 trevorbutterworth: other documents we've created.
294 00:50:39.880 --> 00:50:41.869 Ken Ebert: Yeah, this one's okay to me.
295 00:50:46.240 --> 00:50:47.210 Keela Shatzkin: Great.
296 00:50:48.550 --> 00:50:52.849 Steve Davis (Shatzkin Systems): That last slide, I, maybe this is just a nitpick, but
297 00:50:52.920 --> 00:51:03.209 Steve Davis (Shatzkin Systems): like, you have "APIs are insecure," the third bullet point on the left side.
298 00:51:03.250 --> 00:51:09.380 trevorbutterworth: Oh, yeah, well, sorry, I interrupted. You say that again.
299 00:51:09.870 --> 00:51:18.209 Steve Davis (Shatzkin Systems): I was just gonna say I wouldn't, I wouldn't make that claim, because I think just blanket saying "APIs are insecure" is not accurate,
300 00:51:20.540 --> 00:51:21.720 Steve Davis (Shatzkin Systems): in my opinion.
301 00:51:23.040 --> 00:51:29.540 Steve Davis (Shatzkin Systems): so I don't know if you just delete that whole bullet point. Or
302 00:51:30.410 --> 00:51:33.180 Simon Nazarenko: yeah, I agree. I was, too.
303 00:51:34.780 --> 00:51:37.600 trevorbutterworth: So are we going with "it can be insecure"?
304 00:51:38.790 --> 00:51:42.209 Ken Ebert: You have to do something to secure them.
305 00:51:43.400 --> 00:51:59.370 Ken Ebert: In their raw state, they are not secured by default. They, you have to do work to make them be secure.
306 00:52:00.080 --> 00:52:06.710 Keela Shatzkin: figuring out how to, and I think there could be a little, like, maybe aligning some of the language across the 2,
307 00:52:06.910 --> 00:52:12.080 Keela Shatzkin: like the the bad things and the good things, so that they're more directly, like, checked off.
308 00:52:17.370 --> 00:52:23.889 Steve Davis (Shatzkin Systems): Then the next bullet point after that, also, you might want to change it to say could, can be insecure, or
309 00:52:24.090 --> 00:52:25.490 Steve Davis (Shatzkin Systems): maybe, or something.
310 00:52:27.240 --> 00:52:32.450 Ken Ebert: So Sam was pretty adamant that the, the,
311 00:52:32.580 --> 00:52:39.580 Ken Ebert: if you're gonna have an API to a mobile device, it is fairly, inherently insecure. There's not a lot you can do to fix it.
312 00:52:41.010 --> 00:52:41.830 Steve Davis (Shatzkin Systems): Okay.
313 00:52:43.520 --> 00:52:59.959 Ken Ebert: The the fact that servers do interesting things to secure their APIs, and it's kind of a well known process to to address the security concerns. But it's hard to do that on the mobile, the other way around.
314 00:53:02.120 --> 00:53:09.410 Ken Ebert: So if I want to talk to you, if I want to contact your mobile device, there's not a good API-based way to do it securely.
315 00:53:13.630 --> 00:53:14.750 Keela Shatzkin: Okay.
316 00:54:11.370 --> 00:54:12.190 okay.
317 00:54:14.220 --> 00:54:16.109 Keela Shatzkin: I don't have any comments on this one.
318 00:54:17.380 --> 00:54:18.609 Ken Ebert: me, neither.
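Ken's point that mobile agents can, but are not required to, create a unique DID for each relationship can be sketched as follows: a fresh key, and therefore a fresh identifier, is generated per connection, so different contacts cannot correlate the holder across contexts. The derivation below is deliberately simplified and is not a real did:key or did:peer encoding.

```python
# Sketch of pairwise DIDs: a new key (and identifier) per relationship, so contacts
# cannot correlate the holder. Simplified; not a real did:key / did:peer encoding.
import hashlib
import secrets

def new_pairwise_did() -> tuple[str, bytes]:
    private_key = secrets.token_bytes(32)               # stand-in for a real key pair
    fingerprint = hashlib.sha256(private_key).hexdigest()[:16]
    return f"did:example:{fingerprint}", private_key

connections = {}
for party in ["pharmacy", "airline", "employer"]:
    did, key = new_pairwise_did()
    connections[party] = did                             # a different DID per relationship

print(connections)  # three unrelated-looking identifiers for the same holder
```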
319 00:54:47.820 --> 00:54:53.379 Ken Ebert: My big gripe on this slide is that there are 2 periods after "pars" on the fourth point.
320 00:54:54.420 --> 00:54:56.690 Ken Ebert: Oh, good, nice catch.
321 00:54:58.590 --> 00:55:04.119 Ken Ebert: I've I've I've I've I've seen a whole bunch of little things that I've taken notes on, but that was good. Good, I'd missed that.
322 00:55:04.770 --> 00:55:16.209 Keela Shatzkin: I wonder also, in those heavy text slides that we did at the beginning, do we want to make any mention about, like, "see cryptographic signatures and privacy section"?
323 00:55:18.480 --> 00:55:21.930 trevorbutterworth: Maybe. Yes.
324 00:55:21.970 --> 00:55:29.610 Keela Shatzkin: stringing things together throughout the document. But it may be helpful, because if people are getting lost in the words, like, this picture goes a long way.
325 00:55:29.930 --> 00:55:32.480 trevorbutterworth: yeah, yeah, yeah.
326 00:55:37.260 --> 00:55:38.290 trevorbutterworth: okay.
327 00:55:52.320 --> 00:55:58.329 trevorbutterworth: There might be a better way to lay this out, so the schema is more clear. I think there is.
328 00:56:00.860 --> 00:56:06.280 Keela Shatzkin: Yeah, and maybe we get to it, but I haven't seen it yet, is the schema
329 00:56:06.800 --> 00:56:09.570 Keela Shatzkin: relationship to the ecosystem.
330 00:56:11.540 --> 00:56:18.179 Ken Ebert: The "must be revocable" is not, that is not true. They don't have to be revocable,
331 00:56:19.320 --> 00:56:21.430 Ken Ebert: down in the fourth paragraph.
332 00:56:38.260 --> 00:56:39.240 trevorbutterworth: Okay?
333 00:56:39.600 --> 00:56:45.410 trevorbutterworth: So I'm gonna, I'm going to redo this slide too, because this is not, doesn't really work.
334 00:56:46.480 --> 00:56:50.359 trevorbutterworth: This is version 2 of an attempt to make it work. But
335 00:56:52.520 --> 00:56:53.430 trevorbutterworth: move on.
336 00:56:54.470 --> 00:56:55.410 Ken Ebert: Yeah.
337 00:56:57.850 --> 00:57:00.490 trevorbutterworth: So this is all, this is all being redone
338 00:57:10.300 --> 00:57:12.079 trevorbutterworth: or created anew.
339 00:57:29.300 --> 00:57:39.199 Ken Ebert: So, in the point one part of it, I think we need to emphasize that paragraphs 2 and 3 are done once,
340 00:57:39.770 --> 00:57:44.770 Ken Ebert: and not, right now it sounds like that happens each time a patient comes through the process.
341 00:57:48.660 --> 00:57:54.179 Ken Ebert: Okay, yeah, I would put the second and third paragraphs as item 0,
342 00:57:54.370 --> 00:58:09.579 Ken Ebert: and let's say that this happens at the very beginning, in setup and configuration, and then steps 1, which is the, follows its KYC process, and 2, it issues the thing, and 3, are the normal things that happen.
343 00:58:09.750 --> 00:58:12.090 Keela Shatzkin: Do we want to, sorry.
344 00:58:12.710 --> 00:58:15.120 trevorbutterworth: So these are all basically one,
345 00:58:16.460 --> 00:58:26.319 Ken Ebert: as opposed to 1, 2, 3. The second paragraph that writes the public DID and a credential schema, those 2 paragraphs,
346 00:58:26.380 --> 00:58:31.370 trevorbutterworth: okay, those are step 0, and they only happen one time at configure
347 00:58:31.590 --> 00:58:43.369 Ken Ebert: or set up. And the items "a public health authority," "the issuer sends," and "scanning the QR code," those 3 items go 1, 2, 3, every time a new patient shows up or does something.
348 00:58:46.550 --> 00:58:48.330 Keela Shatzkin: On number 2, though, do we,
349 00:58:49.570 --> 00:58:52.700 Keela Shatzkin: It's very specific, they send an email.
350 00:58:53.240 --> 00:59:01.929 Keela Shatzkin: It doesn't have to be email. It also doesn't have to be sent to them individually. It can also be a generic QR code, right, where they can, like, register.
351 00:59:05.930 --> 00:59:09.370 Ken Ebert: How about "the issuer connects with the patient"
352 00:59:09.940 --> 00:59:10.920 Ken Ebert: to
353 00:59:12.700 --> 00:59:14.670 Ken Ebert: issue the the credential,
354 00:59:14.690 --> 00:59:17.909 Keela Shatzkin: or "the issuer provides
355 00:59:18.250 --> 00:59:22.989 Keela Shatzkin: a QR code for establishing the initial connection."
356 00:59:24.630 --> 00:59:26.490 Ken Ebert: It doesn't have to be a QR code.
357 00:59:27.050 --> 00:59:32.540 Keela Shatzkin: There could be a link.
358 00:59:33.980 --> 00:59:37.010 Ken Ebert: And then it's "following the
359 00:59:37.180 --> 00:59:43.240 Ken Ebert: QR code or invitation URL will create a direct, encrypted link."
360 00:59:43.400 --> 00:59:48.719 Keela Shatzkin: Yeah, because it's not, we're not yet at the credential part. We're at establishing a connection, I think.
361 00:59:48.930 --> 00:59:49.810 Ken Ebert: Yeah.
362 01:00:05.460 --> 01:00:07.019 Keela Shatzkin: I also think there's,
363 01:00:07.100 --> 01:00:14.030 Keela Shatzkin: it's very vague on the verifying the person using the app is the same one,
364 01:00:14.340 --> 01:00:17.480 Keela Shatzkin: that's, there's like a lot there.
365 01:00:23.780 --> 01:00:26.890 trevorbutterworth: So I'm not, I'm not sure I follow what you, what you're saying.
366 01:00:29.220 --> 01:00:44.249 Keela Shatzkin: Well, there may be, I, an identity step in there where they're asking them to prove their identity. They may have, if, if the implementation pipeline is that, or connection pipeline is that,
367 01:00:44.390 --> 01:00:54.080 Keela Shatzkin: I, a human, vetted you, and so I know who you are. I'm only offering this link, or, you know, onboarding method.
368 01:00:56.190 --> 01:00:57.780 Ken Ebert: I've got it. The person,
369 01:00:58.120 --> 01:01:00.090 Keela Shatzkin: okay.
370 01:01:00.870 --> 01:01:05.220 Ken Ebert: So I'm only gonna give you the credential because I've already vetted you.
371 01:01:06.960 --> 01:01:14.669 Keela Shatzkin: 3 doesn't happen, or 2 and 3 don't happen, and plus one is,
372 01:01:15.550 --> 01:01:24.859 Ken Ebert: So, how does the issuer server verify the person using the app is the same one that registered as a patient?
373 01:01:24.900 --> 01:01:28.870 Keela Shatzkin: Okay? So then that goes away, in number 3's second sentence.
374 01:01:30.100 --> 01:01:33.460 Keela Shatzkin: okay, there's an assumption of,
375 01:01:35.560 --> 01:01:37.830 Keela Shatzkin: based on step one, that being done.
376 01:01:39.800 --> 01:01:45.549 trevorbutterworth: Sorry, I I I'm not sure what has to happen to number 3. The second sentence goes away?
377 01:01:46.380 --> 01:01:51.029 Ken Ebert: "After connecting, the issuer server verifies that," that something disappears.
378 01:01:54.900 --> 01:01:57.420 Keela Shatzkin: And we are at time.
379 01:01:57.810 --> 01:02:02.169 trevorbutterworth: We were having so much fun, I lost track of that.
380 01:02:02.390 --> 01:02:13.669 trevorbutterworth: It looks like we're just about two-thirds of the way through. So, but we will need to discuss, because,
381 01:02:13.750 --> 01:02:20.329 trevorbutterworth: yeah, there's some substantial things. Let me just quickly, this, the next session.
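A sketch of the flow as the group is reshaping it: step 0 happens once at configuration (publish the issuer DID and credential schema), and the repeated steps are sharing an invitation as a QR code or link, establishing an encrypted connection, and issuing the credential. This is loosely modeled on out-of-band style invitations; the URL, the "oob" parameter, and all function names are illustrative assumptions, not the white paper's wording or Cardea's exact protocol.

```python
# Sketch of the discussed flow: step 0 once (publish issuer DID + schema); then,
# per patient: share an invitation (QR code or link), connect, issue the credential.
# Loosely modeled on out-of-band invitations; names and the "oob" parameter are illustrative.
import base64, json

def setup_issuer_once() -> dict:
    # Step 0: one-time configuration - write the public DID and credential schema.
    return {"issuer_did": "did:example:public-health-authority",
            "schema": {"name": "TestResultCredential", "attributes": ["result", "collected"]}}

def make_invitation(issuer: dict) -> str:
    # Step 1: the issuer provides a QR code or link for establishing the connection.
    invite = {"from": issuer["issuer_did"], "goal": "issue-credential"}
    encoded = base64.urlsafe_b64encode(json.dumps(invite).encode()).decode()
    return f"https://issuer.example.org/invite?oob={encoded}"  # render as QR or send as a link

def issue_over_connection(issuer: dict, holder_did: str, values: dict) -> dict:
    # Steps 2-3: after the encrypted connection is up, send the signed credential.
    return {"issuer": issuer["issuer_did"], "holder": holder_did,
            "schema": issuer["schema"]["name"], "values": values}

issuer = setup_issuer_once()                       # once, at configuration time
url = make_invitation(issuer)                      # repeated for each new patient
credential = issue_over_connection(issuer, "did:example:patient123", {"result": "negative"})
print(url)
print(credential)
```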
382 01:02:20.480 --> 01:02:22.010 Keela Shatzkin: We certainly can.
383 01:02:22.370 --> 01:02:34.860 trevorbutterworth: Yeah, rather than having you just try to blitz through it. You have a link for this? I can put it in our notes and encourage people to go review the the second.
384 01:02:35.340 --> 01:02:41.429 Keela Shatzkin: Let me, let me update, let me update it before sending it out,
385 01:02:42.620 --> 01:02:47.589 Keela Shatzkin: all the changes, so the changes can be reviewed as well.
386 01:02:49.340 --> 01:02:50.500 Keela Shatzkin: Thanks a lot.
387 01:02:50.800 --> 01:02:52.220 Steve Davis (Shatzkin Systems): all right. It's good.
388 01:02:52.670 --> 01:02:57.250 trevorbutterworth: Thanks, everybody. We'll see you in 2 weeks,
389 01:02:57.390 --> 01:03:01.529 Keela Shatzkin: which will be in July. How exciting.
390 01:03:02.090 --> 01:03:04.190 Keela Shatzkin: Have a fabulous day.
391 01:03:04.350 --> 01:03:05.440 Steve Davis (Shatzkin Systems): Bye.
392 01:03:06.020 --> 01:03:07.280 Simon Nazarenko: thank you. Bye.