Transcript (auto-generated)
I’m Jody Nelson at SRES Shorts, and I’d like to discuss documentation. I know that for us engineers it’s like fingernails on a chalkboard, because all we want to do is develop something, create something we can touch and feel and see move. However, documentation, as we’ve discussed for safety, security, and responsible AI, is extremely critical. In some cases we’ll have regulations around documentation. In other cases we’ll have customers requiring external assessments, for example. We may have internal auditors reviewing our work. We need to build arguments for why we think our product is safe, why we think it’s secure, or why we believe we’ve been responsible when developing AI.
All these aspects have to be considered, and to do so we need some kind of tool set and some kind of processes. In most cases we can use whatever corporate tools we’re already using. In other cases, when we don’t have appropriate tools, we should be discussing with our management or upper management to get those tools into the company. Now, a lot of the assessors we’ve worked with in the past put a high level of focus on how we document things. When they see sloppy documentation, with things located all over the place, it looks like we’re scattered, like we don’t know what we’re doing, like we’re not safe. The truth could be that we are very safe underneath it all, but the appearance says a lot. So it’s very critical that we create good documentation, and that we don’t document just for documentation’s sake; we should be creating high-quality documents. When we do a safety analysis, it’s not just to get a file to put into a folder and check a box.
That safety analysis should drive a safer product. It should find any gaps we have in that product, or it should make the argument: this is why we believe we’re safe, because look, we did the safety analysis and we don’t have any gaps. So documentation overall is extremely critical in our development with safety, security, and responsible AI.