{"id":300,"date":"2025-07-20T11:10:28","date_gmt":"2025-07-20T11:10:28","guid":{"rendered":"https:\/\/humancompatible.org\/?page_id=300"},"modified":"2025-07-20T11:10:28","modified_gmt":"2025-07-20T11:10:28","slug":"open-source-toolkits","status":"publish","type":"page","link":"https:\/\/humancompatible.org\/index.php\/open-source-toolkits\/","title":{"rendered":"Open-Source Toolkits"},"content":{"rendered":"\n<p>humancompatible.org develops several toolkits within the <a rel=\"noreferrer noopener\" href=\"https:\/\/github.com\/humancompatible\/\" data-type=\"URL\" data-id=\"https:\/\/github.com\/humancompatible\/\" target=\"_blank\">humancompatible<\/a> organization on GitHub and contributes to two flagship toolkits of the Linux Foundation. Our own toolkits include:<\/p>\n\n\n\n<p><a rel=\"noreferrer noopener\" href=\"https:\/\/github.com\/humancompatible\/detect\" target=\"_blank\">humancompatible.detect<\/a> is an open-source toolkit for detecting bias in AI models and their training data.<\/p>\n\n\n\n<p><a href=\"https:\/\/github.com\/humancompatible\/explain\">humancompatible.explain<\/a> is an open-source toolkit for counterfactual explanations with a variety of desiderata and a focus on fairness.
<\/p>\n\n\n\n<p><a rel=\"noreferrer noopener\" href=\"https:\/\/github.com\/humancompatible\/interconnect\/\" data-type=\"URL\" data-id=\"https:\/\/github.com\/humancompatible\/interconnect\/\" target=\"_blank\">humancompatible.interconnect<\/a> is an open-source toolkit for modelling, simulation, and theorem proving concerning the ergodicity of multi-agent systems.<\/p>\n\n\n\n<p><a href=\"https:\/\/github.com\/humancompatible\/repair\">humancompatible.repair<\/a> is an open-source toolkit for post-hoc verification of fairness and the repair of models that fail it.<\/p>\n\n\n\n<p><a rel=\"noreferrer noopener\" href=\"https:\/\/github.com\/humancompatible\/train\" target=\"_blank\">humancompatible.train<\/a> is a library of stochastically constrained stochastic optimization algorithms for training AI systems with fairness guarantees. It serves as a plug-and-play replacement for PyTorch optimizers.<\/p>\n\n\n\n<p>Linux Foundation toolkits include:<\/p>\n\n\n\n<p><a rel=\"noreferrer noopener\" href=\"https:\/\/github.com\/Trusted-AI\/AIF360\" data-type=\"URL\" data-id=\"https:\/\/github.com\/Trusted-AI\/AIF360\" target=\"_blank\">AI Fairness 360<\/a> is an open-source toolkit developed by a wider research community to help detect and mitigate bias in machine learning models throughout the AI application lifecycle.<\/p>\n\n\n\n<p><a href=\"https:\/\/github.com\/Trusted-AI\/AIX360\" data-type=\"URL\" data-id=\"https:\/\/github.com\/Trusted-AI\/AIX360\">AI Explainability 360<\/a> is an open-source toolkit developed by a wider research community that supports interpretability and explainability of datasets and machine learning models.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>humancompatible.org develops several toolkits within the humancompatible organization on GitHub and contributes to two flagship toolkits of the Linux Foundation.
Our own toolkits include: humancompatible.detect is an open-source toolkit [&hellip;]<\/p>\n","protected":false},"author":2,"featured_media":0,"parent":0,"menu_order":0,"comment_status":"closed","ping_status":"closed","template":"","meta":[],"_links":{"self":[{"href":"https:\/\/humancompatible.org\/index.php\/wp-json\/wp\/v2\/pages\/300"}],"collection":[{"href":"https:\/\/humancompatible.org\/index.php\/wp-json\/wp\/v2\/pages"}],"about":[{"href":"https:\/\/humancompatible.org\/index.php\/wp-json\/wp\/v2\/types\/page"}],"author":[{"embeddable":true,"href":"https:\/\/humancompatible.org\/index.php\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/humancompatible.org\/index.php\/wp-json\/wp\/v2\/comments?post=300"}],"version-history":[{"count":3,"href":"https:\/\/humancompatible.org\/index.php\/wp-json\/wp\/v2\/pages\/300\/revisions"}],"predecessor-version":[{"id":303,"href":"https:\/\/humancompatible.org\/index.php\/wp-json\/wp\/v2\/pages\/300\/revisions\/303"}],"wp:attachment":[{"href":"https:\/\/humancompatible.org\/index.php\/wp-json\/wp\/v2\/media?parent=300"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}