# checkToxicityImage

## Description

Checks the toxicity of an image, given a valid image URL. Two models score the toxicity: Bumble's [private-detector](https://github.com/bumble-tech/private-detector) and Yahoo's [Open NSFW](https://github.com/yahoo/open_nsfw) model.

## Parameters

| Parameter | Type   | Description |
| --------- | ------ | ----------- |
| imageUrl  | string | URL of the image to check |

## Response

| Parameter | Type   | Description |
| --------- | ------ | ----------- |
| message   | string | message describing the outcome of the request |
| status    | int    | <p>1 for successful execution</p><p>-1 if an error occurs during execution of the request</p> |
| response  | object | <p>an object with three properties:<br>bumble - toxicity score from the Bumble model<br>nsfw - toxicity score from the Open NSFW model<br>is_toxic - true if the image is toxic, false otherwise</p> |

## Example Request and Response

### Prerequisites

Before making requests with the NO.AI SDK, you must install it.

You can install the NO.AI SDK using either `npm` or `yarn`:

```sh
# with npm
npm install @nest25/ai-core-sdk

# or with yarn
yarn add @nest25/ai-core-sdk
```

### Request

Here is an example of how to make a `checkToxicityImage` request using the NO.AI SDK:

```javascript
// import the ai-core-sdk
import {AIServices} from '@nest25/ai-core-sdk';

// create a new instance of the sdk
const aiServices = new AIServices();

async function main() {
  // get the result of the test
  const result = await aiServices.checkToxicityImage('https://ik.imagekit.io/BIOSPHERE/1678716455079_PTj9bkO9d.jpeg');
  console.log(result);
}

main();
```

### Response

```json
{
    "message": "Request successful",
    "response": {
        "bumble": 94.26363110542297,
        "is_toxic": true,
        "nsfw": 99.0408182144165
    },
    "status": 1
}
```
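A caller might interpret this response as follows. This is a minimal sketch, not part of the SDK: `interpretResult` is a hypothetical helper name, and the field names and sample values are taken from the example response above.

```javascript
// Sample response, copied from the example above.
const result = {
  message: 'Request successful',
  response: {
    bumble: 94.26363110542297,
    is_toxic: true,
    nsfw: 99.0408182144165
  },
  status: 1
};

// interpretResult is a hypothetical helper (not an SDK method): it checks
// the documented status field, then returns a short verdict string built
// from the two model scores and the is_toxic flag.
function interpretResult(result) {
  if (result.status !== 1) {
    // status -1 indicates an error occurred during the request
    throw new Error(`Request failed: ${result.message}`);
  }
  const { bumble, nsfw, is_toxic } = result.response;
  return is_toxic
    ? `toxic (bumble: ${bumble.toFixed(1)}%, nsfw: ${nsfw.toFixed(1)}%)`
    : 'not toxic';
}

console.log(interpretResult(result)); // prints "toxic (bumble: 94.3%, nsfw: 99.0%)"
```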
