Use this pre-trained ccb-j1-cls computer vision model to retrieve predictions with our hosted API or deploy to the edge.
Inference is Roboflow's open source deployment package for developer-friendly vision inference.
Using Roboflow, you can deploy your classification model to a range of environments, including our hosted API, web applications, edge devices such as an NVIDIA Jetson, and mobile devices.
Below, we have instructions on how to use our deployment options.
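Since Inference can also run models directly in a Python process, here is a minimal sketch of that path using the package's get_model quick-start (installed with pip install inference). Treat it as an illustration rather than the documented workflow for this page: the API key requirement and the shape of the returned objects are assumptions to verify against Roboflow's Inference docs.
python
from inference import get_model

# Minimal sketch: load this model in-process with the open source Inference
# package. API_KEY stands in for your Roboflow API key, assumed to be needed
# to fetch the model weights.
model = get_model(model_id="ccb-j1-cls/1", api_key="API_KEY")

# Run classification on a local image; returns a list of response objects
results = model.infer("your_image.jpg")
print(results)
The hosted API examples below do not require this package.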
To call the hosted API, first install the client dependency by running pip install inference-sdk.
Then, add the following code snippet to a Python script:
python
from inference_sdk import InferenceHTTPClient

# Connect to Roboflow's hosted classification API
CLIENT = InferenceHTTPClient(
    api_url="https://classify.roboflow.com",
    api_key="API_KEY"
)

# Run inference on a local image with this model ID and version
result = CLIENT.infer("your_image.jpg", model_id="ccb-j1-cls/1")
print(result)
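The result comes back as a Python dictionary. As a rough sketch of how you might read it, the snippet below assumes the typical Roboflow classification response shape, with a predictions list of class/confidence entries; print the raw result first to confirm the exact keys for your model.
python
# Hypothetical follow-up: pick out the highest-confidence class.
# The "predictions", "class", and "confidence" keys are assumptions based on
# the usual Roboflow classification response; verify against print(result).
predictions = result.get("predictions", [])
if predictions:
    best = max(predictions, key=lambda p: p["confidence"])
    print(f"Top class: {best['class']} ({best['confidence']:.2f})")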
We're using axios to perform the POST request in this example, so first run npm install axios to install the dependency.
javascript
const axios = require("axios");
const fs = require("fs");

// Read the image from disk and encode it as base64
const image = fs.readFileSync("YOUR_IMAGE.jpg", {
    encoding: "base64"
});

// POST the base64-encoded image to the hosted classification endpoint
axios({
    method: "POST",
    url: "https://classify.roboflow.com/ccb-j1-cls/1",
    params: {
        api_key: "API_KEY"
    },
    data: image,
    headers: {
        "Content-Type": "application/x-www-form-urlencoded"
    }
})
    .then(function (response) {
        // The prediction results are returned in the response body
        console.log(response.data);
    })
    .catch(function (error) {
        console.log(error.message);
    });
swift
import UIKit

// Load Image and Convert to Base64
let image = UIImage(named: "your-image-path") // path to image to upload ex: image.jpg
let imageData = image?.jpegData(compressionQuality: 1)
let fileContent = imageData?.base64EncodedString()
let postData = fileContent!.data(using: .utf8)

// Initialize Inference Server Request with API_KEY, Model, and Model Version
var request = URLRequest(url: URL(string: "https://classify.roboflow.com/ccb-j1-cls/1?api_key=API_KEY&name=YOUR_IMAGE.jpg")!, timeoutInterval: Double.infinity)
request.addValue("application/x-www-form-urlencoded", forHTTPHeaderField: "Content-Type")
request.httpMethod = "POST"
request.httpBody = postData

// Execute Post Request
URLSession.shared.dataTask(with: request, completionHandler: { data, response, error in
    // Parse Response to String
    guard let data = data else {
        print(String(describing: error))
        return
    }

    // Convert Response String to Dictionary
    do {
        let dict = try JSONSerialization.jsonObject(with: data, options: []) as? [String: Any]
    } catch {
        print(error.localizedDescription)
    }

    // Print String Response
    print(String(data: data, encoding: .utf8)!)
}).resume()
Look through our full documentation for more information and resources on how to utilize this model.
Use this model with a full-fledged web application that has all sample code included.
Perform inference at the edge with a Jetson via our Docker container (see the local inference sketch below).
Utilize your model on your mobile device.
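For the edge options above, the same inference-sdk client can point at a Roboflow Inference server running on the device instead of the hosted API. The sketch below assumes the server is already running (for example, started from the Docker container mentioned above) and listening on Inference's default port 9001; the address and setup steps will vary with your deployment.
python
from inference_sdk import InferenceHTTPClient

# Assumes a Roboflow Inference server is already running on the edge device
# (e.g. started from the Docker container) and reachable on its default port.
CLIENT = InferenceHTTPClient(
    api_url="http://localhost:9001",  # replace with your device's address if remote
    api_key="API_KEY"
)

result = CLIENT.infer("your_image.jpg", model_id="ccb-j1-cls/1")
print(result)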