WEBVTT 1 00:07:54.080 --> 00:07:56.240 Char Howland: Hi, Hussein. Thanks for joining the meeting. 2 00:07:57.950 --> 00:08:00.440 Char Howland: You will get started at the top of the hour. 3 00:08:04.290 --> 00:08:10.619 hossein namazian: Hey? Hey, Shan, how are you? I'm good. How about you? do that? 4 00:08:11.000 --> 00:08:12.329 Char Howland: Glad to hear it. 5 00:08:12.680 --> 00:08:22.629 hossein namazian: I see that no ideas. Yeah, yeah, we're a little early, but we'll have more people joining. 6 00:08:22.700 --> 00:08:29.270 Char Howland: so thanks for joining. I have you joined the call before? 7 00:08:30.260 --> 00:08:32.669 hossein namazian: no. 8 00:08:33.280 --> 00:08:38.629 Char Howland: I've been on some high pleasure. We talked before but 9 00:08:38.809 --> 00:08:40.929 hossein namazian: a few months. 10 00:08:41.870 --> 00:08:42.799 Char Howland: Great. 11 00:08:46.230 --> 00:08:51.409 Char Howland: nice! Well, thanks everyone for joining. Well. we'll wait a few minutes to get started. 12 00:08:57.560 --> 00:08:59.829 Nico: Excuse me, can I ask a question? 13 00:09:02.940 --> 00:09:18.510 Nico: Sure, sure. Yes, of course, I I my my laptop. But somehow Zoom says that I need a code is is that's correct? And is there a code available? I? Strangely, I can join through my desktop, which I'm doing now. 14 00:09:18.630 --> 00:09:27.899 Char Howland: Oh, interesting, are you? Could it be some sort of issue with the sign in because there you shouldn't need a code to enter. 15 00:09:30.020 --> 00:09:36.459 Nico: Okay, I'll I'll check that. Maybe something's wrong with the way I sold zoom or something. 16 00:09:36.480 --> 00:09:38.320 Nico: I'll check. Thanks. 17 00:09:40.370 --> 00:09:42.870 Char Howland: Thanks. Yeah. I I hope that works out 18 00:11:01.710 --> 00:11:02.560 everybody. 19 00:11:03.450 --> 00:11:05.259 Char Howland: Hi, Shawn, thanks for joining. 20 00:11:12.780 --> 00:11:19.579 Sean Bohan (Hyperledger): Give me 1 min, sure, and I will get. I'm gonna stop the recording. And then I'm going to start the stream and then restart the recording 21 00:11:20.100 --> 00:11:21.960 Char Howland: sounds great, wonderful 22 00:11:22.810 --> 00:11:26.989 Sean Bohan (Hyperledger): as a backup for zoom and recording. And I'm gonna set. 23 00:11:28.420 --> 00:11:31.700 Sean Bohan (Hyperledger): sure. And it's Tim here. 24 00:11:31.920 --> 00:11:32.920 Char Howland: Yes. 25 00:11:33.310 --> 00:11:40.760 Sean Bohan (Hyperledger): set both of you to co-host. just in case my co-host, just in case 26 00:11:42.410 --> 00:11:46.039 Sean Bohan (Hyperledger): something happens to my why isn't sure it? 27 00:11:49.750 --> 00:11:56.509 Sean Bohan (Hyperledger): Something happens, I get knocked out of the all right. Sure, take it away. It's all yours. 28 00:11:56.550 --> 00:12:06.550 Char Howland: Steven Current, our our great speaker, for the day here, so I'll I'll pass it off to Tim to walk us through the working group. Status updates. 29 00:12:08.130 --> 00:12:16.359 Tim Spring: Yes. Good morning, everybody. Just give me 1 s here to share my screen, so we can kind of all walk through it together. 30 00:12:20.970 --> 00:12:33.609 Tim Spring: so yeah, good morning. It is June fifteenth today, and on our call day will be reviewing. Some working group says updates. And then we also have a presentation from Steven Current. 31 00:12:33.750 --> 00:12:37.400 Tim Spring: So thank you for joining us today, Stephen, I appreciate it 32 00:12:38.300 --> 00:12:46.629 Tim Spring: quick reminder. 
We are under the hyper ledger antitrust policy. So please just be aware of that 33 00:12:47.020 --> 00:12:53.480 Tim Spring: few announcements. We have a few upcoming speakers. we have Nick Steele on the 20 ninth 34 00:12:53.600 --> 00:12:59.109 Tim Spring: Stephen Muy, on the thirteenth of July, and then Dimitri 35 00:12:59.250 --> 00:13:04.440 Tim Spring: Zealand on the 27. So if any of these 36 00:13:04.530 --> 00:13:09.179 Tim Spring: speakers look interesting to you, please be sure to come on back in a couple of weeks 37 00:13:10.860 --> 00:13:16.760 Tim Spring: and then we also have a hyper ledger in depth with the red data technology. 38 00:13:17.100 --> 00:13:21.700 Tim Spring: So it's on June 20 first. And anyone can register at this link right here. 39 00:13:23.510 --> 00:13:31.930 Tim Spring: So jumping right in, it looks like the Hyper Ledger Indie contributors working group met on the sixth. was anyone able to attend this call. 40 00:13:32.520 --> 00:13:49.580 Char Howland: yeah, I I was. I can give a brief update. But I wondered if it also might be useful to see if we have any introductions that anybody wants to make on the call I think we've got. We've got a a fantastic group here, and and probably a lot of new faces to this call. So if people 41 00:13:49.730 --> 00:13:57.569 Char Howland: want to take the mic and and introduce yourself and say, talk about your interest in in 42 00:13:57.580 --> 00:14:01.980 Tim Spring: in decentralized identity. that would be a great time. 43 00:14:04.230 --> 00:14:11.339 James Kempf: Hi, James Kemp, I'm the CTO wagon, which is a virtual power plant company and 44 00:14:11.360 --> 00:14:13.799 James Kempf: I think we might use this for 45 00:14:13.880 --> 00:14:18.350 James Kempf: IoT device, identification, or 46 00:14:18.550 --> 00:14:21.380 James Kempf: and potentially all also users. Thanks. 47 00:14:22.520 --> 00:14:28.790 Char Howland: Thanks for joining. That's great. my Novak. It looks like you have your hand up. 48 00:14:29.000 --> 00:14:42.719 mi novac: Yes, thank you. Good morning, everyone. My name is Michael Novak, and with the open voice network, and been a long time fan and student for digital identity. I'm very excited. I come out of the IoT world. 49 00:14:42.730 --> 00:14:53.289 mi novac: So I see this as being a key technology enabler for this. But also, recently you may have heard of generative AI in the news at least once or twice in the last 15 min. 50 00:14:53.490 --> 00:14:59.730 mi novac: And again, verifiable credentials and digital identity are perfect fit 51 00:14:59.830 --> 00:15:01.040 mi novac: for 52 00:15:01.160 --> 00:15:15.129 mi novac: conversational voice as well as taking in out of an object. So I'm really excited to check Stevens math today, and I'll run through generative. AI. Don't worry, Steven. I'll make sure you're doing it correctly. Okay. 53 00:15:15.140 --> 00:15:16.430 Stephen Curran: awesome. 54 00:15:16.820 --> 00:15:18.070 Char Howland: We're glad you're here. 55 00:15:21.940 --> 00:15:25.780 Char Howland: Any other introductions that anybody would like to make. 56 00:15:28.980 --> 00:15:50.590 Steve MM: Good morning. Steve. Michael is Martin from Boeing, Vancouver. We also are, working on a project to integrate the verifiable credentials and part of our authorization access. And I identity and access management authentication 57 00:15:50.590 --> 00:16:05.069 Steve MM: mechanism. sort of a variety of applications to it that we're researching at the moment. 
So this is a fantastic application I'm interested in. how this moves forward. So thank you 58 00:16:05.930 --> 00:16:07.990 Char Howland: absolutely. Thanks for joining Steve. 59 00:16:16.170 --> 00:16:20.809 Char Howland: I'll drop the meeting page link in the chat here. 60 00:16:21.010 --> 00:16:23.900 Char Howland: let's see oops. 61 00:16:25.390 --> 00:16:30.630 Char Howland: So I think I dropped the wrong link in the chat. But I will correct that right now. 62 00:16:32.990 --> 00:16:36.260 Char Howland: let's see so that Anybody can 63 00:16:36.740 --> 00:16:42.079 Char Howland: put their name on the attendees list if they would like 64 00:16:45.320 --> 00:16:56.679 Char Howland: so great unless anybody else has any announcements or introductions that they would like to make, we could continue on with the working group updates. 65 00:17:08.730 --> 00:17:36.260 Tim Spring: It sounds like that's about it. sure, I think we left off at the Indie contributor's call. give us a quick summary of that 66 00:17:36.260 --> 00:17:58.499 Char Howland: But at the same time we have a roadmap of things. We would love to add to it and and see implemented. So Just a a discussion about that that, I think, was really useful. And then we also spent time on the call going over open issues and Indie plenum. Steven, I don't know if you have anything you wanted to add about that discussion, or what we did on that call. 67 00:17:59.320 --> 00:18:01.109 Stephen Curran: No, that sounds about right 68 00:18:01.620 --> 00:18:08.120 Stephen Curran: looking forward to More discussions on, you know where it is going. And and 69 00:18:08.510 --> 00:18:13.479 Stephen Curran: you know how we can can figure out how to expand contributions. 70 00:18:13.950 --> 00:18:15.390 Char Howland: Yeah, absolutely 71 00:18:17.450 --> 00:18:27.809 Tim Spring: awesome. Well, sounds good. Thank you for sharing. it looks like the areas working group met just yesterday. Wasn't anyone able to attend the most er recent areas working group calls? 72 00:18:33.660 --> 00:18:44.400 Tim Spring: Okay? well, it looks like they're discussing some owf resolution. And did peer test. Not qualified migration stuff. If you want more info these links are. 73 00:18:44.590 --> 00:18:54.150 Tim Spring: we'll take you right to their notes. The Aries, by fault group met on the sixth. was anyone able to attend the areas by phone call? 74 00:19:02.410 --> 00:19:07.229 Tim Spring: Okay? It looks like they're working on an update and discussing some some key issues 75 00:19:07.600 --> 00:19:14.269 Tim Spring: the Aries Cloud agent. Python users Group and on the thirteenth was anyone able to attend this session? 76 00:19:20.090 --> 00:19:25.719 Char Howland: yeah, I I was there. talked about an update on the 77 00:19:25.910 --> 00:19:41.020 Char Howland: BC Gov code with us. that we had at in Dco are working on to update, occupy to use the Hyper ledger version of of a non cred. And so, talking about the refactors, got the Mvp. Of revocation completed. 78 00:19:41.280 --> 00:19:59.630 Char Howland: which was a big step. And and the things we're looking at next are the automated registry setup genericizing the Revocation registry registry recovery. test updates clean up as well. and we also talked about the 79 00:20:00.350 --> 00:20:12.669 Char Howland: 0 7, 2 final release and and merging Pr and acupy. Steven, I I don't know if you again have have anything you'd want to add to that summary. 
80 00:20:15.830 --> 00:20:33.620 Stephen Curran: I can figure out how to unmute. yeah. 0 8 2 is ready. we've got some final things going into that And then we went into app by plugins and updates in progress. We're going on making stuff. And we added a new Maintainer to the project, which is pretty cool. 81 00:20:33.730 --> 00:20:39.710 Stephen Curran: All of those are covered there. Welcome folks to join us at the next meeting in 2 weeks 82 00:20:39.740 --> 00:20:41.450 Stephen Curran: 2 weeks from this one 83 00:20:43.860 --> 00:20:46.150 Stephen Curran: sounds good. Thanks for sharing. 84 00:20:49.480 --> 00:20:59.900 Tim Spring: I believe we've met. I'm not sure if this is the same day as our last meeting. but was anyone at the latest Aries framework Javascript call? 85 00:21:06.660 --> 00:21:14.690 Tim Spring: All right. Well, it looks like they're kind of discussing what the future of areas will look like and how to get started. for more details. You can click that link 86 00:21:15.210 --> 00:21:23.279 Tim Spring: versus getting it of life for ledger and on credits. Looks like they've been on the fifth Did anyone attend the a non credits call 87 00:21:23.770 --> 00:21:31.000 Stephen Curran: we've had a couple of meetings. Yeah, basically. Yeah, that 88 00:21:31.440 --> 00:21:51.479 Stephen Curran: we're moving forward on the stack. We now have a mentor part of the pro in the mentorship program working on the spec. So we're moving back forward and also some super interesting discussions on the 2.0 plans and some of the substitutions of new Z kp, stuff which we're going to be talking about soon 89 00:21:51.550 --> 00:21:56.400 Stephen Curran: on this call into an opera to. 90 00:21:57.480 --> 00:22:00.379 Stephen Curran: So the top is going on in in that working group. 91 00:22:03.390 --> 00:22:06.100 Tim Spring: Alright awesome. Thanks, Stephen. 92 00:22:06.560 --> 00:22:14.150 Tim Spring: it looks like T. IP. Hasn't been doing too much 93 00:22:21.650 --> 00:22:31.839 Tim Spring: looks like the diff did come. Spec working group met on the fifth. was anyone able to attend the the did top spec working group. 94 00:22:40.350 --> 00:22:47.450 Tim Spring: All right. Looks like they're working on ion compatibility for you to come 2.1. 95 00:22:47.480 --> 00:22:51.730 Tim Spring: And then they're working on some new marketing mission issues for Ddkov. 96 00:22:56.630 --> 00:22:58.449 Tim Spring: This is May 97 00:23:00.920 --> 00:23:10.070 Tim Spring: all right. Unless I'm mistaken. I believe that is all of our working group updates. does anyone have any general updates or groups that we've missed. 98 00:23:11.490 --> 00:23:13.740 Sean Bohan (Hyperledger): hey, Tim? Just 99 00:23:13.850 --> 00:23:27.440 Sean Bohan (Hyperledger): quick! Call out, the errors framework. Javascript. Recently, really 0 point 4.0 the team who's worked on that, including Ariel Bend and Kareem are going to give a demo slash workshop 100 00:23:27.550 --> 00:23:32.470 Sean Bohan (Hyperledger): on Wednesday, June 20, eighth. The link is in the 101 00:23:32.890 --> 00:23:42.289 Sean Bohan (Hyperledger): chat. it's not really going to be a hands on workshop, but they wanna really go in depth, and all the changes and and the new stuff that's in the new release. So that's coming up in 102 00:23:42.310 --> 00:23:44.489 Sean Bohan (Hyperledger): a little less than 2 weeks. 103 00:23:46.270 --> 00:23:48.420 Tim Spring: All right. Very cool. Thanks, Shawn. 104 00:23:50.000 --> 00:23:56.970 Tim Spring: Yeah. 
I'll give another a brief pause to see if there are any other updates or announcements before we hand it off to Steven. 105 00:24:04.730 --> 00:24:11.839 Stephen Curran: All right, Steven. The floor is yours. 106 00:24:11.970 --> 00:24:14.570 Stephen Curran: and I'll jump into the presentation 107 00:24:15.930 --> 00:24:19.649 Stephen Curran: Let me close. That 108 00:24:22.500 --> 00:24:30.409 Stephen Curran: got chatted open off to the side. So if anyone has comments, let me know. Let me leave that there. 109 00:24:32.210 --> 00:24:35.729 Stephen Curran: Well, not quite there, because I can see. There we go. 110 00:24:37.310 --> 00:24:39.589 Stephen Curran: all right. Can you see my screen? 111 00:24:42.230 --> 00:24:51.429 Char Howland: Yes, it looks great. 112 00:24:51.560 --> 00:25:04.019 Stephen Curran: I'm using that tweet slides a bit to adjust into it. But We'll share them up in the top corner if you want to. There's this bit lay d I dash the Kps, that's the 113 00:25:04.300 --> 00:25:12.390 Stephen Curran: link to the slides themselves. So if you want to grab those now or follow along I should probably put that in chat. But maybe someone can. 114 00:25:12.620 --> 00:25:31.810 Stephen Curran: so online identity with verify the credentials and then we'll get into the meat of it. Which is the Kp using high school maps to explain what zkp, 0 knowledge groups are. So I'll jump in. That's the agenda. A brief brief, since this is the identity. See, there's not a lot you need to know about 115 00:25:32.080 --> 00:25:38.500 Stephen Curran: online identities and then focus mostly on the 0 knowledge proof section and what they are. 116 00:25:39.090 --> 00:26:05.100 Stephen Curran: So financials, paper credentials are what we use in the world. That's what we use for 2,500 plus years. many of them are for identity. There's ones down here are things like professional. the attestations, professional credentials, like, you know, I'm an engineer. I'm an architect. I'm a doctor or a lawyer. Those types of things. There's supply chain. There's 117 00:26:05.590 --> 00:26:06.950 Stephen Curran: IoT 118 00:26:06.960 --> 00:26:30.890 Stephen Curran: certifications. that could be, for well, I guess those are definitely not paper but there's there's lots of paper credentials in the world, and the paper credential model is is one that, as they say, we use a an issue, or some sort of authority, gives a credential to a holder, that holder puts it into their wallet, or puts it into their filing cabinet, or or puts it somewhere, and sometime later. 119 00:26:31.320 --> 00:26:42.210 Stephen Curran: you know, separate transaction. A verifier wants to see that piece of paper for some some business purpose, and so the holder pulls it out of the wallet, or 120 00:26:42.230 --> 00:26:58.669 Stephen Curran: takes it down to the office of the the verifier, and shows them the piece of paper and the piece of paper in theory. and I put quotes around proofs who issued the credentials. So there's some sort of marker on credential that shows who issued it, who holds the credential. There's some sort of binding 121 00:26:58.700 --> 00:27:22.160 Stephen Curran: on the credential between the person presenting it and the credential itself, and some sort of verification. That the the claims are unchanged and proves is is done because the big thing here is is concerned with forgeries and things like that, that the holder somehow manipulated the document, either created it themselves or altered it in some way. 
122 00:27:22.840 --> 00:27:39.109 Stephen Curran: trust is largely this between the holder and the verifier, but there's also the trust between the verifier and the issue, or the verifier chooses. What issuers? the credentials of what issue, or is there willing to correct them to accept. 123 00:27:39.760 --> 00:27:45.250 Stephen Curran: So in in, when I talked about that, there's both the technology and the governance 124 00:27:45.410 --> 00:28:01.409 Stephen Curran: aspect to it. Does the you know, technology. Does it look like it's on the right paper. Does it look like what that that organization that issue or organization produces? Does it look like there's, you know, ink marks on it where? I've changed my 125 00:28:01.520 --> 00:28:04.700 Stephen Curran: date of birth on my driver's license, so that I can 126 00:28:05.330 --> 00:28:11.970 Stephen Curran: use it for other purposes. And then the governance is, is things about what's the source of the of the 127 00:28:12.000 --> 00:28:34.179 Stephen Curran: authority of the issuer? Is it a trustworthy organization? What? Where the where does their authority come from when they issue a piece of paper? What are the processes they use for that? So those are all the things we talk about in identity. paper, identity paper credentials online are basically done these days by taking a picture of them and scanning of a nuts. 128 00:28:34.180 --> 00:28:42.109 Stephen Curran: And that's the where where we are today, generally with with the use of credentials digitally. 129 00:28:42.740 --> 00:28:45.760 Stephen Curran: What we want is a verifiable credential model. 130 00:28:45.770 --> 00:28:57.049 Stephen Curran: again, very. I think everyone here should be familiar with this. the issue, or provides a a, an, an issuance of a credential that is 131 00:28:57.060 --> 00:29:12.749 Stephen Curran: got some cryptographic. backing to it, they hand it to the holder. The holder. At some later time they, the holder, puts it in their wallet, pulls onto it. Their their digital wallet holds onto it some later time they present it to the verifier 132 00:29:13.460 --> 00:29:16.240 Stephen Curran: and there's a verifiable 133 00:29:16.410 --> 00:29:31.880 Stephen Curran: data registry. is is a place where cryptographic material goes, such that when the verifier gets the credential from the holder they are able to, verify. 134 00:29:31.880 --> 00:29:47.749 Stephen Curran: not by contacting the issue or in finding out whether it's about it. It's it's accurate whether it's it can be verified, but rather by going to some independent place to get information, such as public keys, and so on, to verify the cryptography. 135 00:29:47.840 --> 00:30:04.910 Stephen Curran: we use this list 1, 2, 3, 4, in fact. verifiable credentials with capital V. Capital C, as in defined by the W. 3 C. only talks about the first 2, which is, who issued the credential and the claims are unchanged. So there is a path 136 00:30:04.910 --> 00:30:19.369 Stephen Curran: to find out who issued the credential via the information in the the presentation, provide from the holders the verifier, and there is, a signature on it, a a cryptographic signature to verify the problems of a change. 137 00:30:19.380 --> 00:30:43.440 Stephen Curran: in the nonprofit world and places we work. There is a formal way of defining who holds the credential of binding between the person presenting the credential and the credential itself. 
How they are associated that in in An on cred is formally defined as part of the cryptography and other in W, 3 C, 138 00:30:43.470 --> 00:30:59.780 Stephen Curran: data data model standard that's outside of the stack, and has to be determined in some other way. So something like there's a picture of the person in it, and the that's the binding, or there's some or there's a a a a dig. 139 00:30:59.840 --> 00:31:17.220 Stephen Curran: and the person proves control over that did some sort of mechanism to bind it. And then as well. There's a fourth item which is available in some types of based on the issuers use case and and how the issue or handles it, which is, the claims have not been revoked. So those 140 00:31:17.270 --> 00:31:30.059 Stephen Curran: those are the proofs that come about way less concerned about whether the Holder force. It's almost. It's pretty much impossible to force those types of things much more on. Do you trust the issuer 141 00:31:30.100 --> 00:31:40.359 Stephen Curran: and So that's a big piece. There's also concerns about the software that goes along. So do you trust the issue or software? Do you trust the holders? Software? And so on. 142 00:31:40.700 --> 00:32:02.449 Stephen Curran: this is different from open id connect and log in by Facebook. So I did want to underline that when I talked about this for for those new to the topic. again, I think everyone knows that here. And the and the big issue is that the issuer is involved in every interaction. When you're using open id connect that the 143 00:32:02.580 --> 00:32:15.029 Stephen Curran: there is only a single process, and in that process the user sort of consents to both the issue or in the line party and the issue, or delivers the data directly, and of course, in a verifiable credential model 144 00:32:15.240 --> 00:32:37.530 Stephen Curran: on presentation, the issue, or is, is kept out of the picture, and the interaction is only between the issue and the rowing party. Okay, that's the background on verifiable credentials and what we're using them for. hyper ledger and non prince is a an instance of a a way to use verifiable connections to a verifiable credential type. 145 00:32:37.610 --> 00:32:45.189 Stephen Curran: it's a project that the Hopper Hyper Ledger foundation. there's a complete open source implementation of it in rust. 146 00:32:45.490 --> 00:32:54.440 Stephen Curran: that is based on the and on current specification. That is also being built and created in the Hyper Ledger foundation. 147 00:32:54.530 --> 00:33:18.900 Stephen Curran: this this implementation has a long history. hyper Ledger. Indie came out about 7 years ago. in the the self-serving identity stack, and nonprofit has been pulled out and revamped from that in the implementation that in the implementation itself derived from A from an Ibm implementation. So there's a long history of this. 148 00:33:19.130 --> 00:33:30.850 Stephen Curran: the big change that was implemented in pulling in our cards out of Vindi is verified on data, registry and agnosticism 149 00:33:30.930 --> 00:33:45.640 Stephen Curran: ledger agnostic, which means you do not have to use it. An indie ledger to store the objects necessary to have the and not create interactions. They could be published in a variety of places, and people have 150 00:33:45.640 --> 00:34:02.570 Stephen Curran: already published. a a. A such objects in a number of place outside of the Indy. Still the most, you know. Common place. You'll see them, but it's no longer a requirement. 
And and so that's a big bush that we're trying to do in the in the, in, on trace community. 151 00:34:02.570 --> 00:34:16.139 Stephen Curran: So what does it add to the picture which is privacy? And that privacy comes in a privacy preserving elements, and that comes in for for flavors. One is selective disclosure, so that when you have a credential 152 00:34:16.600 --> 00:34:20.140 Stephen Curran: that you've been issued and you present it. 153 00:34:20.467 --> 00:34:49.000 Stephen Curran: You don't have to present the entire documents so unlike a paper document where you hand over the paper document to be looked at. you can actually redact, if you will, some of the fields, and just present the things necessary for the business transaction you're conducting, and so The verifier can still see who issued it can still verify that it's the the various aspects of it. But they don't see all of the what data of the action use within them? 154 00:34:49.310 --> 00:34:55.610 Stephen Curran: predicate proofs so predicate proofs are where a 155 00:34:56.260 --> 00:35:07.140 Stephen Curran: this is the most obvious 0 knowledge proof where you prove that you are, for example, older than a certain age based on a date of birth in the credential, without 156 00:35:07.240 --> 00:35:23.460 Stephen Curran: sharing the data birth itself. So you're you're proving something in in the credential. But you're not actually sharing the data for it. And and by pro, you're not claiming or or suggesting self, a testing, you're actually proving it cryptographically. 157 00:35:24.590 --> 00:35:49.409 Stephen Curran: this is a paper in that. The of why a nonprofit is is really important is unlinkable identifier. So in in pretty much every other verifiable, credential model and approach. When you share a presentation, you're sharing unique identifiers either for yourself or for the credential itself. So the signal, if you 158 00:35:49.490 --> 00:35:58.390 Stephen Curran: are given a very a verifiable credential, and the way of presenting it is simply to show the other party the provincial itself. 159 00:35:58.420 --> 00:36:20.989 Stephen Curran: that the signature on it is a unique identifier. It's very, very much unique. And and so you're actually sharing a unique identifiers for it. And so what? and on press guys? These goes highly, very far out of its way to make sure that there is no linkable identifiers simply by presenting a verifiable credential. 160 00:36:22.710 --> 00:36:43.899 Stephen Curran: that's That is a key place where where zkps are. Use your knowledge proofs which can both to get into. We're getting there. which is that you can prove that the signature, for example, is valid on a verifiable credential without sharing signature itself. 161 00:36:43.900 --> 00:37:07.739 Stephen Curran: and again prove being the operative there. And finally, multi multi credential presentation so inherent in and on press is a that you can present multiple credentials at the same time and proof that they're tied together and do that all with selective disclosure. And again, that allows for a data minimization. If you need to prove that you're a lawyer and 162 00:37:07.750 --> 00:37:26.470 Stephen Curran: You know who you are as a as a resident of, say, British Columbia, and prove that you're a lawyer. You can present those 2 credentials, minimize the day share and and still prove those things and prove that they were both issued to you, or to your wallet. 
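To make selective disclosure and predicate proofs concrete, here is a minimal sketch of what an AnonCreds-style proof request might look like. The attribute names, cutoff value, and issuer restriction are illustrative placeholders, not anything shown on this call; the point is that the verifier asks only for the fields it needs plus a predicate, rather than the raw birth date.

```python
# Illustrative AnonCreds-style proof request (names and DIDs are placeholders).
# The verifier asks to reveal only "given_name" and requests a predicate proof
# that the integer-encoded birth date is on or before a 19-years-ago cutoff,
# so the raw birth date itself is never shared.
proof_request = {
    "name": "age-verification",
    "version": "1.0",
    "nonce": "857362551325",  # fresh random value to prevent replay
    "requested_attributes": {
        "name_referent": {
            "name": "given_name",
            "restrictions": [{"issuer_did": "<issuer DID>"}],
        }
    },
    "requested_predicates": {
        "age_referent": {
            "name": "birthdate_dateint",   # birth date encoded as YYYYMMDD
            "p_type": "<=",
            "p_value": 20040615,           # 19 years before 2023-06-15
            "restrictions": [{"issuer_did": "<issuer DID>"}],
        }
    },
}
```

The verifier learns the revealed name and a yes/no answer to the predicate, but never the underlying birth date.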
163 00:37:27.020 --> 00:37:29.780 Stephen Curran: So that's the key features that are added. 164 00:37:29.790 --> 00:37:56.340 Stephen Curran: I should throw that I do throw in that I do a lot of my work. with the digital line energy team. And the Government of British Columbia, this slide sort of highlights. Why, government of British Columbia is so engaged in this. basically vc, and every other jurisdiction puts a ton of of focus on physical identity cards and 165 00:37:56.340 --> 00:38:07.250 Stephen Curran: and the importance they provided in underpinning the economy and and and life in in A, in a jurisdiction. the world is moving online. 166 00:38:07.470 --> 00:38:25.649 Stephen Curran: PC, therefore, is investing in figuring out the best ways to provide those same services, to make it safe for citizens to operate online for residents to operate online. And I highlighted do need to protect data, privacy and security. And that's in particular why DC is so interested in the on press. 167 00:38:26.010 --> 00:38:38.799 Stephen Curran: we, the the organization, wants to to keep trying to make it. That the approach used to verify the credentials is as private and secure as possible. 168 00:38:40.720 --> 00:38:55.309 Stephen Curran: With that we move on to the fun part, the high school math addition. $0 proof. So we're going to talk about. We're going to jump back to your your high school math and talk about how the the graphic proofs work with your knowledge. 169 00:38:55.770 --> 00:39:00.400 Stephen Curran: thanks to Professor Kazoo Sacco, who 170 00:39:00.460 --> 00:39:19.649 Stephen Curran: did the preliminary. You did early versions of this the first time I saw this type of thing, Mike Lauder from sovereign, and now it in other organizations, but very involved in the and on press community did a bunch of these. And actually, it was my daughter that did a lot of the slides and presentate and and 171 00:39:20.080 --> 00:39:40.809 Stephen Curran: math parts of these that you're going to see. So how? kudos to those. So what is this 0 knowledge? Proof? here's the quote. You know, a method. One party can prove to another party that they know about you. X, and we're going to talk about X a lot in this without convey any information apart from the fact that they know that value. 172 00:39:41.610 --> 00:39:52.999 Stephen Curran: it's as mentioned, it's the pro. It's the core of an ongrad, and that example that I give. You know I'm older than 19, based on my date of birth 173 00:39:53.230 --> 00:40:01.530 Stephen Curran: and but without sharing my date of birth. So one of the approaches used to to do. 174 00:40:01.640 --> 00:40:29.020 Stephen Curran: for instance, age, verification, and this is proposed in the Iso mbl model, and and and some of the things I've seen in in other places is oh, well, let's just put in. You know a a field that says older than 19, older than 21 older than 25. And so that's another way to get around that particular use case. And it is a super important use case. with an on correct. You actually put the data birth in. But the 175 00:40:29.090 --> 00:40:39.429 Stephen Curran: holder does not share the data birth. They just share a proof that they are older than a given page requested by the verifier. So that's what we're after. 176 00:40:39.490 --> 00:40:54.670 Stephen Curran: This is the interaction that happens. We got a holder prover that know some piece of information, and wants to prove it without revealing the value. Likewise, the verifier does not know. X. 
Wants to know. 177 00:40:54.730 --> 00:40:59.250 Stephen Curran: wants to verify that the prover knows X without 178 00:40:59.300 --> 00:41:22.249 Stephen Curran: learning about X itself. So both parties have a want to participate in this. So let's start with a nursery school addition. So this is an example of, you know, really getting simple with it. So you recall those who grew up in the age of Where's Waldo? Or or had kids that did 179 00:41:23.830 --> 00:41:46.850 Stephen Curran: relished in the knowledge that they knew where Waldo was on any particular page in the book, but they never wanted to let their friends know where they were, because then their friends could claim they found it themselves so. So what it so how do you do that prove that you know what a Waldo is, but not share where Waldo actually is. So the way you can do that is, make a sheet of paper. That's 4 times the size of 180 00:41:46.850 --> 00:41:59.640 Stephen Curran: the page in the Waldo book. Put a little hole in it, and then move the page the Waldo page around behind it, such that Waldo appears 181 00:41:59.710 --> 00:42:01.820 Stephen Curran: inside that little hole. 182 00:42:02.000 --> 00:42:12.429 Stephen Curran: The person looking at it can see wall, though they know that you know where wallow is, but they can't see where on the page the person is where a wall, though, is. 183 00:42:13.230 --> 00:42:17.190 Stephen Curran: So that's the simplest. The the nursery school, that is. 184 00:42:17.680 --> 00:42:26.739 Stephen Curran: 3 requirements of zkps completeness. If the statement is true and the honest verifier will be convinced that it's in fact 185 00:42:26.880 --> 00:42:30.820 Stephen Curran: it. It is known by the honest prover 186 00:42:31.050 --> 00:42:45.429 Stephen Curran: soundness. If the statement is false, no cheating prover can convince the honest verifier that this is true except with some small probability, and we're going to get to that in a bit probability involved in in. 187 00:42:45.780 --> 00:42:58.260 Stephen Curran: and finally, the 0 knowledge component. If the statement is true, the verifier learns nothing other than the fact that the statement is true. They don't actually under learn about the date of birth, the the the value underlying. 188 00:42:58.430 --> 00:43:10.999 Stephen Curran: So keep those in mind complete completeness, soundness and 0 knowledge attributes mentioned this little earlier. The Kps are actually probabilistic, not deterministic. 189 00:43:11.090 --> 00:43:13.710 Stephen Curran: You are not going to get a hundred percent 190 00:43:13.780 --> 00:43:24.220 Stephen Curran: knowledge. There you are going to get a a probabilistic, but we're getting pretty darn close, and you'll see that 191 00:43:24.250 --> 00:43:31.960 Stephen Curran: There's an element of randomness always in it which plays into how the 0 knowledge proof is is provided. 192 00:43:32.530 --> 00:43:42.570 Stephen Curran: And then we're gonna talk about the different forms of ckps, notably interactive Z. Kps and not interactive. Z Kps, 193 00:43:42.950 --> 00:43:48.130 Stephen Curran: foreshadowing a bit. Not interactive is better. We'll see why that is 194 00:43:48.950 --> 00:44:13.529 Stephen Curran: okay. High school math. here's where we get to the refresher for high school map. we need to cover functions and inverse functions. So we'll talk about functions. We'll talk about exponents and some of the rules of exponents because they come into play. 
very clearly in this: the modulo operator and prime numbers. And basically these components 195 00:44:14.320 --> 00:44:28.099 Stephen Curran: that literally you covered, probably in what we in North America have as grade 10 or grade 11 math, are all you need to know. RSA, the Diffie-Hellman, 196 00:44:28.410 --> 00:44:37.509 Stephen Curran: the Diffie-Hellman algorithm, the SHA-256 hash, all of these cryptographic things are all based on these 197 00:44:38.010 --> 00:44:40.190 Stephen Curran: core components of the math. 198 00:44:40.770 --> 00:45:06.979 Stephen Curran: So, a function. A function is an equation for which any x can be plugged in and exactly one y comes out of the equation, one result comes out of the equation. A simple one there: f(x) equals x plus 2. So if I put 25 in, f(x) is 27; if I put 2 in, it's 4, and so on. So all of these are examples of functions. 199 00:45:06.980 --> 00:45:16.770 Stephen Curran: And basically you have, in this case, one variable that you insert, you do the calculation, and you get your result out. So easy stuff; you know that stuff. 200 00:45:17.490 --> 00:45:22.719 Stephen Curran: The inverse of a function is where you reverse it. So, 201 00:45:23.350 --> 00:45:42.870 Stephen Curran: given the output, how do you figure out what the input is? And so you do the manipulations. You probably remember doing those: oh, I can take the 2 over to the other side by converting the plus sign into a minus sign. You'll remember that. So x equals y minus 2, 202 00:45:43.590 --> 00:45:51.400 Stephen Curran: and we get the inverse function. So we've got our example of our original function, and we can calculate the inverse function. 203 00:45:51.770 --> 00:46:02.209 Stephen Curran: In these examples, all of these ones, what you'll figure out is it's pretty easy to go from the original function to the inverse and back. 204 00:46:02.650 --> 00:46:25.079 Stephen Curran: Those are all easy. What we want for ZKPs is a function that is essentially impossible to invert. So what we want is something that we cannot do the inverse for, and that is a core feature, a core requirement. And, in fact, a lot of the work of cryptography is to find, 205 00:46:26.430 --> 00:46:42.090 Stephen Curran: as we'll see, not just the functions, but the numbers, the types of numbers, that contribute to making it impossible to invert those 206 00:46:44.660 --> 00:46:55.929 Stephen Curran: functions, those inverse functions. Exponents: an exponent refers to the number of times a number is multiplied by itself, so 2 times 2 times 2 is 2 to the exponent 3. 207 00:46:55.990 --> 00:47:12.610 Stephen Curran: So again, we remember that; x to the fifth, we've got that. So exponents, pretty easy, you've seen those in regular life. Laws of exponents, not exhaustive, but we've got a few that are really important here: x to the 0 is 1, 208 00:47:13.290 --> 00:47:17.550 Stephen Curran: x to the 1 is x itself, you just drop the exponent off, 209 00:47:17.880 --> 00:47:26.420 Stephen Curran: x to the a times x to the b gives you x to the a plus b. This is one we're going to use a bunch. Actually, we're going to use all of these. But 210 00:47:26.500 --> 00:47:51.440 Stephen Curran: so again, the example expands out why that is true. So you can see that 2 to the third times 2 to the second is actually 2 to the fifth, so adding the exponents together. And finally, x to the a, all to the b, is x to the a times b.
So again, you can do the same sort of expansion out and see that that's true. So good, we've got exponents covered. 211 00:47:51.540 --> 00:47:54.440 Stephen Curran: This is faster than you did it in high school, I expect. 212 00:47:55.370 --> 00:48:12.070 Stephen Curran: The modulo operator. The modulo operator gives the remainder after the division. So it's done the same way as division, but the answer, rather than being, you know, how many times does 5 go into 17, 213 00:48:12.080 --> 00:48:20.199 Stephen Curran: rather, we care about the remainder. And so that's what you see here: 17 mod 5 is 2, 214 00:48:20.710 --> 00:48:27.070 Stephen Curran: and for those of you with a calculator handy or quick with math, 321 mod 17 equals 15. 215 00:48:29.340 --> 00:48:45.350 Stephen Curran: A lot of people like to think of this as the clock face. I'm not so good on this one, but I put it up there because many people relate to it. Basically you count the ticks around, and what you stop at is the modulo 12 of a number. So, 216 00:48:45.440 --> 00:48:55.910 Stephen Curran: 27: you go around once to 12, around again to 24, and you come to the 3; that's the modulo of it. So there you go, that's a way to think of the modulo. 217 00:48:59.000 --> 00:49:10.310 Stephen Curran: Prime numbers, the last one, pretty easy: a number divisible only by itself and one, so an infinite number of these. 218 00:49:11.330 --> 00:49:22.149 Stephen Curran: Basically, prime numbers are pretty important in cryptography, and again, that comes back to the need for these things to make that one-way version of the function. 219 00:49:22.420 --> 00:49:23.510 Stephen Curran: So 220 00:49:24.310 --> 00:49:37.289 Stephen Curran: we're going to come back to it, but we're going to start again with another example that's commonly used, and quite a good one: Ali Baba's cave. And we're going to show how a ZKP is interactive 221 00:49:37.350 --> 00:49:42.879 Stephen Curran: and probabilistic. So that's where these two concepts come into play with this. 222 00:49:45.010 --> 00:49:56.179 Stephen Curran: So, Ali Baba's cave: Bob is the verifier, Alice is the prover, because, of course, we can't have anything in this community without Alice and Bob being involved. In the cave 223 00:49:56.350 --> 00:50:19.080 Stephen Curran: there's two paths through the cave, A and B, and there's a magic door between them. So Alice is claiming to Bob that she knows the code to open the magic door, and she's going to prove to Bob that she knows that. But she doesn't want Bob to know that code; she just wants to keep it as her own secret. She's not allowed to tell Bob that. 224 00:50:19.360 --> 00:50:32.529 Stephen Curran: So the way Bob and Alice figure out to determine whether she knows it is: Bob stands outside the cave, Alice goes in, and then, as she goes in, she picks either A or B to go down. 225 00:50:33.770 --> 00:50:36.129 Stephen Curran: So in this case she picked A, 226 00:50:36.250 --> 00:50:43.520 Stephen Curran: and then Bob, Bob does not know which path Alice took. Bob stands there and says, hey, 227 00:50:43.730 --> 00:50:48.789 Stephen Curran: Alice, come out one of the sides, hey, come out A. And so Alice 228 00:50:49.000 --> 00:50:57.679 Stephen Curran: goes out A, and that was easy, because she didn't even have to use her code. She just came out A because she picked the same one Bob asked for.
So 229 00:50:57.810 --> 00:51:12.709 Stephen Curran: Bob now has some evidence that Alice knows the code, because if she had gone in B she would have had to use the code. But, you know, she could have just gone in A, so Bob really doesn't believe Alice yet. 230 00:51:13.690 --> 00:51:16.810 Stephen Curran: So let's do it again. 231 00:51:17.520 --> 00:51:25.890 Stephen Curran: Alice goes in again. This time, Bob says, oh, come out B, thinking Alice is going to pick the same way in. 232 00:51:26.600 --> 00:51:31.120 Stephen Curran: Alice can, of course, go in A; since Alice knows the code, 233 00:51:31.310 --> 00:51:38.519 Stephen Curran: she uses the code, goes through the magic door, comes out side B, and Bob goes, okay, twice that worked. 234 00:51:39.010 --> 00:51:41.630 Stephen Curran: And now 235 00:51:42.550 --> 00:51:47.759 Stephen Curran: the way you get it is the interactive part. We've got probabilistic, 236 00:51:48.370 --> 00:52:15.789 Stephen Curran: we've got randomness going in: Alice randomly picks A or B. We've got randomness from Bob: Bob is randomly picking A or B to come out. And we've got interaction: we're having it repeated over and over, and every time Alice is coming out, not the wrong side, the right side. And Bob now thinks, well, there's no way Alice can be reading my mind and know which way I'm going to guess. 237 00:52:15.790 --> 00:52:24.980 Stephen Curran: So I'm getting pretty convinced, as I do this, you know, 10, 20, 30 times, that probably Alice knows it. 238 00:52:25.110 --> 00:52:40.680 Stephen Curran: So again, this is the probabilistic nature. Bob doesn't know absolutely, deterministically, for sure that Alice knows the code. It's just extremely unlikely that she would have guessed the same thing that he suggested every time. 239 00:52:48.170 --> 00:52:56.149 Stephen Curran: So, Ali Baba's cave. Completeness: if Alice honestly knows the secret code, Bob will eventually be convinced she knows the code, 240 00:52:57.150 --> 00:53:09.230 Stephen Curran: probabilistically, and that's done through repetition and interaction. Soundness: if Alice did not know the secret code, it is highly unlikely, through repetition, that she would be able to keep convincing Bob she knows it; 241 00:53:09.630 --> 00:53:19.639 Stephen Curran: if ever the chance came that she went in the wrong one for what Bob asked, she can't come out the correct side of the cave without the code. 242 00:53:19.770 --> 00:53:22.900 Stephen Curran: And, of course, zero knowledge: Bob never learns the secret code. 243 00:53:25.450 --> 00:53:36.070 Stephen Curran: So now we switch to math. Now we go over to using those four elements of high school math, and we figure out how to do Ali Baba's cave. 244 00:53:36.100 --> 00:53:39.430 Stephen Curran: So the first thing we do is we need a one-way function, 245 00:53:39.470 --> 00:54:04.589 Stephen Curran: one where the inverse is essentially impossible. So, coming back to that: if you know x, it's easy to find F(x); if you know F(x), it's pretty much impossible to go backwards and find x. And this function right here, F(x) = g^x mod p, is the one that's commonly used for zero-knowledge proofs. So g is some public and known value; g is known by Alice and Bob. 246 00:54:05.170 --> 00:54:15.179 Stephen Curran: x, of course, is the number we're trying to figure out. And then we do modulo p on it, where again p is a public and known value, and it's a prime. So 247 00:54:15.740 --> 00:54:18.820 Stephen Curran: Alice and Bob share g and p; 248 00:54:19.110 --> 00:54:34.859 Stephen Curran: only Alice knows x.
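As a quick illustration of why F(x) = g^x mod p is treated as one-way: the forward direction is a single modular exponentiation, while going backwards is the discrete logarithm problem, which has no practical shortcut at real-world sizes. A toy sketch in Python, using the same small values that appear on the next slide:

```python
# Toy illustration of the one-way function F(x) = g**x mod p.
# Forward direction: cheap, one modular exponentiation.
g, p = 5, 17
x = 4                      # Alice's secret
fx = pow(g, x, p)          # 5**4 mod 17 = 13

# Inverse direction (discrete log): nothing better than trying exponents at
# these toy sizes; with real-world g and p this search becomes infeasible.
recovered = next(e for e in range(p) if pow(g, e, p) == fx)
print(fx, recovered)       # 13 4
```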
And we're going to use this so that Bob can determine that Alice really knows x. 249 00:54:36.220 --> 00:54:39.730 Stephen Curran: So, a summary of the steps. Bob and Alice agree on g and p. 250 00:54:40.040 --> 00:54:50.250 Stephen Curran: Alice knows x, and so Alice tells Bob F(x), confident again that knowing F(x) does not allow Bob to determine x. 251 00:54:50.700 --> 00:55:04.770 Stephen Curran: Alice generates a random number r; so Alice picks one. This is the equivalent of Alice entering the cave and randomly choosing A or B. Alice generates that random number, calculates F(r), and shares it with Bob. 252 00:55:05.280 --> 00:55:21.300 Stephen Curran: Bob randomly sends Alice a constant c, which is either 0 or 1 in this first case. And again, this is the equivalent in Ali Baba's cave of Bob saying, come out A, or come out B. 253 00:55:21.970 --> 00:55:34.410 Stephen Curran: Alice defines a new variable v as r plus x times c. Alice knows all of these items, so Alice knows v, because she knows all of these things, 254 00:55:34.570 --> 00:55:37.269 Stephen Curran: and then she shares F(v), 255 00:55:38.300 --> 00:55:46.339 Stephen Curran: sorry, the v, with Bob. Again, Bob can't determine x from the v. 256 00:55:48.490 --> 00:56:06.049 Stephen Curran: Bob verifies the result by checking that F(r), given by Alice, times F(x), given by Alice, to the c, which Bob chose himself, equals the F(v) from the v that Alice shared, and if the two sides match, Alice passes. 257 00:56:06.050 --> 00:56:22.329 Stephen Curran: So this is the key slide that says how it's done, and this is the manipulation that goes on with the exponents. Basically, F(r) times F(x) to the c equals F(v); that's what we said we needed to check. So let's go down this side. 258 00:56:22.550 --> 00:56:26.200 Stephen Curran: F(r) expands to g to the r, 259 00:56:26.470 --> 00:56:35.320 Stephen Curran: F(x) to the c expands to g to the x, all to the c. Then we use our rules of exponentiation: 260 00:56:35.620 --> 00:56:49.350 Stephen Curran: g to the x, to the c, is g to the x times c, and then multiplying that by g to the r is adding to the exponent, so we get g to the r plus x times c, mod p, 261 00:56:49.700 --> 00:56:57.710 Stephen Curran: and that's our result on this side. F(v) is g to the v, mod p. 262 00:56:57.850 --> 00:57:04.669 Stephen Curran: And recall that the v was calculated by Alice; it is r plus x times c, 263 00:57:04.750 --> 00:57:07.940 Stephen Curran: and here we get these matching. 264 00:57:08.120 --> 00:57:20.510 Stephen Curran: The mod p is a factor that just moves out; it's a common factor, therefore we can move it out and have it separated from the rest of the calculations on g, 265 00:57:20.630 --> 00:57:23.570 Stephen Curran: and as a result of that we get 266 00:57:23.700 --> 00:57:34.210 Stephen Curran: this, where only Alice knows x and only Alice knows r, 267 00:57:34.920 --> 00:57:40.400 Stephen Curran: and yet Bob can be confident that Alice knows and is accurately representing those values. 268 00:57:42.640 --> 00:57:46.839 Stephen Curran: So here's some numbers on it. We're going to use really small numbers. 269 00:57:46.920 --> 00:58:00.449 Stephen Curran: x is 4 in this case; don't tell anyone, only Alice knows that. We're going to use 5, this is public, and 17 is the prime number. So g is our constant and p is our prime number. 270 00:58:00.660 --> 00:58:05.729 Stephen Curran: F(x), therefore, is 13. You can do the math on that.
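Written out in symbols, the check walked through above is just the exponent laws from the refresher:

$$
F(r)\cdot F(x)^{c} \;\equiv\; g^{r}\cdot\bigl(g^{x}\bigr)^{c} \;\equiv\; g^{\,r + xc} \;\equiv\; g^{v} \;\equiv\; F(v) \pmod{p}, \qquad \text{where } v = r + xc .
$$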
271 00:58:06.010 --> 00:58:24.849 Stephen Curran: r, the random number that Alice picks, is 7. Again, only Alice knows that, not Bob. F(r) is 10. Bob then declares either a 0 or a 1; he chooses it randomly, tells it to Alice, and then the v 272 00:58:24.860 --> 00:58:34.449 Stephen Curran: is calculated. Only Alice knows it, again, because v is dependent on x and r, and only she knows those. 273 00:58:34.530 --> 00:58:43.940 Stephen Curran: She does that calculation, and so she can then do F(v). 274 00:58:46.140 --> 00:58:51.879 Stephen Curran: Here's the actual math for it. We get case one, where c equals 0, 275 00:58:51.950 --> 00:58:59.709 Stephen Curran: and we get v equals 7, and we get g to the v, mod p, is 10. 276 00:59:00.530 --> 00:59:04.789 Stephen Curran: Bob verifies that: Bob knows what F(r) is, 277 00:59:05.160 --> 00:59:14.509 Stephen Curran: and F(x) to the c, well, if c is a 0, we know that anything to the 0 is 1. 278 00:59:14.810 --> 00:59:23.809 Stephen Curran: So we wind up with this being 10 times 1 mod 17, which is, of course, 10, and we get verification that these two match. 279 00:59:23.980 --> 00:59:26.960 Stephen Curran: Alice passes; that's case one. 280 00:59:28.090 --> 00:59:43.750 Stephen Curran: Case two, here is where we use 1, the only difference here being 13 to the 1 is 13. So this is 10 times 13 mod 17, and if you do the math, you'll find that's 11. Alice passes in either case. 281 00:59:44.610 --> 00:59:49.189 Stephen Curran: Now we have to repeat this process, where we use a different r, 282 00:59:49.400 --> 00:59:55.590 Stephen Curran: a different c, over and over and over again, interactively. 283 00:59:56.410 --> 01:00:07.840 Stephen Curran: On the first pass there's a probability of a half of giving the correct value without knowing x, because she knows Bob's going to send either a c of 0 or a c of 1, so she could guess which it will be. 284 01:00:07.990 --> 01:00:15.020 Stephen Curran: If they do this 20 times, it's a one-in-a-million chance that Alice does not know x, 285 01:00:15.330 --> 01:00:28.790 Stephen Curran: that Alice just kept guessing the right thing Bob was going to send, a c of 0 or 1. And so there's a one-in-a-million chance that that's how it was done. 286 01:00:29.090 --> 01:00:34.910 Stephen Curran: So that's pretty close to being certain, and only 20 times going through this. 287 01:00:37.320 --> 01:00:39.910 Stephen Curran: Generalizing this: 288 01:00:39.940 --> 01:00:51.419 Stephen Curran: instead of c being 0 or 1, we can choose a c in the range of 0 to p minus 1. Remember, p is our prime number 289 01:00:51.560 --> 01:01:05.209 Stephen Curran: that we're using. So this is the equivalent of adding many paths to Ali Baba's cave. Alice has to choose which one of many, and Bob says, come back out 290 01:01:05.450 --> 01:01:15.060 Stephen Curran: one of many. And this reduces the number of iterations necessary to prove it. So basically, 291 01:01:15.060 --> 01:01:36.179 Stephen Curran: from Professor Sacco's presentation, she basically explained it as: each bit in c is an instance of a 0-or-1 iteration. In other words, if we can get c to be 20 bits of information, we've got it down to a one-in-a-million chance that Alice guessed 292 01:01:36.670 --> 01:01:45.810 Stephen Curran: correctly and produced the right zero-knowledge proof, or the right value to send 293 01:01:46.130 --> 01:01:49.280 Stephen Curran: to Bob.
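Here is a small Python sketch of the slide's numbers and of the interactive repetition. It only illustrates the arithmetic with tiny parameters and honest parties; it is not a secure implementation.

```python
import random

# Toy values from the slides: g and p are public, x is Alice's secret.
g, p = 5, 17
x = 4

def F(e):
    return pow(g, e, p)        # F(e) = g**e mod p

fx = F(x)                      # 13, shared with Bob up front

def round_ok(r, c):
    """One round: Alice answers challenge c using her random r."""
    v = r + x * c              # only Alice can compute this (she knows x)
    # Bob's check uses only values he has: F(r), F(x), c, and v.
    return (F(r) * pow(fx, c, p)) % p == F(v)

# The two cases worked on the slides, with r = 7:
print(round_ok(7, 0))   # c = 0: 10 * 1  mod 17 == 10 -> True
print(round_ok(7, 1))   # c = 1: 10 * 13 mod 17 == 11 -> True

# Interactive repetition: fresh random r and c each round. An honest Alice
# always passes; a prover who had to guess c in advance survives a round with
# probability 1/2, so 20 rounds leave roughly a one-in-a-million chance.
print(all(round_ok(random.randrange(1, p), random.randint(0, 1)) for _ in range(20)))
```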
294 01:01:51.470 --> 01:01:59.579 Stephen Curran: But even if we get it down to one back and forth, it's still an interactive process; Alice and Bob are still going back and forth. 295 01:01:59.900 --> 01:02:10.680 Stephen Curran: We want to get down to a simple request-response process: Bob makes a request, Alice sends a proof, Bob verifies it. So how do we eliminate 296 01:02:10.720 --> 01:02:18.449 Stephen Curran: that extra step where Bob has to send that extra 297 01:02:18.490 --> 01:02:20.149 Stephen Curran: value, c? 298 01:02:21.230 --> 01:02:36.620 Stephen Curran: And that's one more piece of it, which is figuring out how to do non-interactive ZKPs. So we've got interactive; we don't want to use repetition to reduce probability; we want to get down to a single back and forth. 299 01:02:37.520 --> 01:02:43.669 Stephen Curran: The way that's done is a hash function, H. Again, a hash function, a non-invertible one, 300 01:02:43.820 --> 01:02:46.230 Stephen Curran: with a random number i. 301 01:02:46.570 --> 01:02:52.170 Stephen Curran: Alice uses that to define c as H of F(r) 302 01:02:52.380 --> 01:02:59.069 Stephen Curran: comma i, using that function and the i that Bob shared. So 303 01:02:59.170 --> 01:03:02.720 Stephen Curran: Bob and Alice both know H, how the function works; 304 01:03:03.150 --> 01:03:16.030 Stephen Curran: Bob provides i as part of his request, and Alice makes up r; Alice just creates it, because r is a random number that Alice chooses. 305 01:03:16.400 --> 01:03:31.499 Stephen Curran: Bob still doesn't choose c, but Bob can calculate c once F(r) is known, and so it's a shared value between them; Bob and Alice both know it, and it is sufficiently random. 306 01:03:32.810 --> 01:03:39.599 Stephen Curran: i is used to prevent a replay attack. So those familiar 307 01:03:40.040 --> 01:04:08.459 Stephen Curran: with cryptography and zero-knowledge proofs and verifiable credentials know about replay attacks. Basically, with Alice and Bob: Bob requests a proof from Alice, Alice prepares and sends that proof, and along the way some outsider, Mallory, we often talk about malicious Mallory, listens in and records the proof that Alice sent to Bob. Later, Bob asks Mallory for a proof, and 308 01:04:08.990 --> 01:04:23.169 Stephen Curran: Mallory replays the recorded proof from Alice and claims it to be their own, and Bob can verify the proof, so he doesn't know the difference. But by using a different i every time a request is sent out, 309 01:04:23.220 --> 01:04:32.630 Stephen Curran: the proof that is received is different every time, and as a result, even if Mallory hears Alice's proof, 310 01:04:32.690 --> 01:04:44.680 Stephen Curran: Mallory can't play it back and pretend it's their own, because the i, the random factor, what's called the nonce, is different, and that prevents a replay attack. 311 01:04:46.880 --> 01:04:48.729 Stephen Curran: So, almost at the end. 312 01:04:51.750 --> 01:05:09.279 Stephen Curran: Oops, there we go. In real life, this is what the number p looks like in an AnonCreds situation. It is a little larger than the 17 that we chose in our example; it's quite a large number, and that is a decimal number. 313 01:05:09.860 --> 01:05:22.480 Stephen Curran: c is between 1 and p minus 1. So given the p from the previous slide, c is between 1 and this number.
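A minimal Python sketch of the hash-based, non-interactive construction just described (sometimes called a Fiat-Shamir style transform). The hash choice and string encoding here are assumptions for illustration, not the actual AnonCreds construction.

```python
import hashlib
import random

# The challenge c is derived by hashing Alice's commitment F(r) together with
# Bob's one-time nonce i, so no extra round trip is needed. SHA-256 and the
# encoding are illustrative assumptions; real systems pin these down precisely.
g, p = 5, 17              # toy public values; a real p is hundreds of digits
x = 4                     # Alice's secret
fx = pow(g, x, p)         # F(x), shared with Bob ahead of time

def H(*parts):
    data = "|".join(str(part) for part in parts).encode()
    return int.from_bytes(hashlib.sha256(data).digest(), "big")

# Bob's request carries a fresh nonce i, which is what prevents replay.
i = random.randrange(2**64)

# Alice builds the whole proof in one shot.
r = random.randrange(1, p)
fr = pow(g, r, p)
c = H(fr, i) % (p - 1)    # both sides can recompute c the same way
v = r + x * c
proof = (fr, v)

# Bob verifies using only F(r), v, F(x), g, p, and his own nonce i.
fr, v = proof
c = H(fr, i) % (p - 1)
print((fr * pow(fx, c, p)) % p == pow(g, v, p))   # True
```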
Remember that when we talked about c, 314 01:05:22.600 --> 01:05:36.970 Stephen Curran: each bit of c represents a piece of entropy. So we said that, you know, 20 bits of it would allow a one-in-a-million chance that Alice 315 01:05:37.800 --> 01:05:48.639 Stephen Curran: luckily selected it. Well, there's a whole lot more than 20 bits of information in this large decimal number, so a whole lot more likelihood 316 01:05:48.670 --> 01:05:51.389 Stephen Curran: that c 317 01:05:52.010 --> 01:05:59.570 Stephen Curran: is going to give enough that the probability is extraordinarily low that Alice could pick it. 318 01:06:00.260 --> 01:06:06.120 Stephen Curran: And then, in real life, this is what g looks like. Again, another big, big, big, huge number. 319 01:06:07.980 --> 01:06:11.530 Stephen Curran: So that completes 320 01:06:12.130 --> 01:06:15.530 Stephen Curran: the coverage of the high school math part. 321 01:06:15.660 --> 01:06:45.270 Stephen Curran: And I'm just at the end of the time that works for this session, so it works out well. ZKPs, as I mentioned, there's basically four commonly used; I'll go over a couple of them here. Blinding an identifier of the holder: this is the holder binding ability that I talked about, which is sort of outside of the W3C spec as far as how it gets done, but in AnonCreds it's very formally defined, and it's always the same. 322 01:06:45.270 --> 01:06:48.740 Stephen Curran: Basically, the holder has a link secret, 323 01:06:49.040 --> 01:07:00.669 Stephen Curran: a big number. A blinded version is put into the verifiable credential. The holder proves to the verifier that they know the link secret, but they don't actually reveal it. 324 01:07:00.720 --> 01:07:08.429 Stephen Curran: And then, secondly, they prove that the same link secret is used with all the presented credentials. 325 01:07:09.600 --> 01:07:37.769 Stephen Curran: Blinding the data values for selective disclosure is the same sort of thing. In this case the issuer signs encoded versions of the data, and I don't know where I've mentioned this, but notice that everywhere I use x, x is a number; x is always a number in zero-knowledge proofs. So as a result, if you want to do something with data involved in a zero-knowledge proof, you have to encode that data as a number. And so AnonCreds has an 326 01:07:37.770 --> 01:07:47.080 Stephen Curran: encoding scheme that converts all of the attributes, all of the data elements, into numbers. And so 327 01:07:47.240 --> 01:08:04.029 Stephen Curran: in AnonCreds it is actually the encoded value that gets signed, not the actual data. So the issuer signs the encoded versions of the data, the numbers representing the data, and the holder blinds the signatures in presenting those: 328 01:08:04.270 --> 01:08:30.469 Stephen Curran: the holder proves to the verifier that it knows the signature without revealing the signature, and then the holder reveals the raw data values of the attributes, and the verifier verifies they correspond to the signed values. So that's how selective disclosure works, and a little bit on how the signatures are blinded in AnonCreds. 329 01:08:30.729 --> 01:08:39.649 Stephen Curran: There are similar capabilities in predicates and similar in revocation, but I just didn't think it was worth going through all the details of those. 330 01:08:41.180 --> 01:08:45.430 Stephen Curran: Here's a pile of references for you.
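One small illustration of the "everything is a number" encoding step described above, before the references. The rule sketched here mirrors a convention commonly seen in Indy/AnonCreds tooling (small integers pass through unchanged, everything else becomes the integer value of its SHA-256 hash); treat it as an assumption and check the AnonCreds specification for the normative rule.

```python
import hashlib

def encode_attribute(raw: str) -> int:
    """Illustrative AnonCreds-style attribute encoding (assumed, not normative)."""
    try:
        value = int(raw)
        if -(2**31) <= value < 2**31:
            return value                      # small integers are used directly
    except ValueError:
        pass
    # Everything else: the integer value of the SHA-256 hash of the UTF-8 string.
    return int.from_bytes(hashlib.sha256(raw.encode("utf-8")).digest(), "big")

print(encode_attribute("20040615"))   # 20040615 (a date as YYYYMMDD stays numeric)
print(encode_attribute("Alice"))      # a very large integer; this is what gets signed
```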
331 01:08:45.960 --> 01:09:15.659 Stephen Curran: this was based on you know. I went to this I iw. 26 presentation by Professor Sacco. It was outstanding. this is a little more formal than she did. There's notes on the on on her presentation. She actually got asked to do a second instance of it. So there's notes, but not kind of the presentation and and the math. So that's this is helpful. Mike Lauder did a presentation when he was with the sovereign foundation. That's linked here, which is 332 01:09:15.660 --> 01:09:32.569 Stephen Curran: about a hundred 60 odd slides that includes detailed math of this a little. Yeah, okay, a lot more events than the high school map. But if you're interested in seeing all of the steps involved in this. that's a good presentation for that. 333 01:09:32.640 --> 01:09:39.049 Stephen Curran: Here's some other posts about cl signatures, which is the 334 01:09:39.450 --> 01:10:12.990 Stephen Curran: the academic paper upon which all of this is involved. the on. And I did that. And then some other interesting papers. Oh, David Chom, 1998. Paper on Blinded signatures, just to show you that this is not brand new stuff. this was the sort of basis upon which C. K. Z. Cash came about, and a bunch of the papers on blinded signatures. So lots of lots of things to look at there. Absolutely. I'll be sharing the presentation. 335 01:10:13.220 --> 01:10:29.800 Stephen Curran: why do they matter? Cts. no shared new identifier to present for governments. This is huge. not sharing creating the Ssn or the social insurance number, the social. 336 01:10:30.030 --> 01:10:36.429 Stephen Curran: those identifiers are are subject to leg legislation, and so on. Creating new ones is difficult. 337 01:10:36.670 --> 01:10:39.650 Stephen Curran: also the unlinkability 338 01:10:39.810 --> 01:10:59.899 Stephen Curran: minimize sharing. And again, unlinkability fighting back against online tracking. I'm at my time. So I better stop some other things in here. But and a call to actually get involved in this feel free to reach out to me if you're interested in want to get involved, want to learn more. 339 01:11:00.020 --> 01:11:01.529 Stephen Curran: welcome to do so. 340 01:11:01.600 --> 01:11:05.990 Stephen Curran: and with that I'll stop sharing and turn it over because we're at time. 341 01:11:12.620 --> 01:11:26.760 Char Howland: Thank you so much, Stephen. That was a a fantastic presentation super interesting. We covered a lot. So thank you. I think one of the questions is, if if the if people can access the slides after 342 01:11:26.860 --> 01:11:34.190 Char Howland: great sounds good, I can post those as well on the meeting page. So 343 01:11:34.230 --> 01:11:38.200 Char Howland: to go to this page 344 01:11:38.560 --> 01:11:42.799 Char Howland: in a moment, I will upload the slides there. So 345 01:11:43.470 --> 01:11:51.869 Char Howland: great. Yeah, thank you so much, Steven, and and thanks everyone for joining in with your working group updates. And we'll see you on 2 weeks. 346 01:11:53.650 --> 01:11:54.610 Thanks. 347 01:11:54.740 --> 01:11:56.730 Stephen Curran: Thank you.