RunPod
Manage RunPod GPU instances and serverless endpoints. Deploy ML models, manage GPU resources, and run inference workloads.
What you can connect
Add these to your scene to give the AI access.
Account
A connected RunPod account
GPU Pod
A GPU-powered RunPod pod for AI/ML workloads
Tools
AI actions available through this integration.
runpod_exec
Execute a shell command on the GPU pod.
runpod_write_file
Write content to a file on the pod.
runpod_read_file
Read a file from the pod.
runpod_list_files
List files in a directory on the pod.
runpod_pod_status
Get the status of the current pod.
runpod_create_pod
Create a new GPU pod.
runpod_terminate_pod
Terminate the current pod.
runpod_stop_pod
Stop the current pod.
runpod_start_pod
Start a stopped pod.
runpod_list_pods
List all pods in the account.
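As a sketch of how an agent might invoke one of these tools, here is a minimal JSON payload builder. The wire format and field names (`tool`, `arguments`) are assumptions for illustration, not RunPod's or Daslab's actual protocol:

```python
import json

def make_tool_call(tool: str, **arguments) -> str:
    """Build a JSON tool-call payload (field names are illustrative, not the real wire format)."""
    return json.dumps({"tool": tool, "arguments": arguments})

# Example: run nvidia-smi on the pod via the runpod_exec tool.
payload = make_tool_call("runpod_exec", command="nvidia-smi")
print(payload)
```

The same builder covers every tool in the list above; only the tool name and argument keys change.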
Use cases
- Deploy and manage GPU instances
- Run ML inference workloads
- Monitor GPU resource utilization
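Chained together, the tools above cover a full pod lifecycle. A sketch of one such sequence, using the tool names from this page (the argument names are hypothetical, and a real agent would poll `runpod_pod_status` until the pod is running before executing commands):

```python
# Illustrative lifecycle built from the integration's tool names; all arguments are assumptions.
lifecycle = [
    ("runpod_create_pod", {"gpu_type": "NVIDIA A100"}),                      # provision a pod
    ("runpod_pod_status", {}),                                               # confirm it is running
    ("runpod_write_file", {"path": "infer.py", "content": "print('hi')"}),   # upload a workload
    ("runpod_exec", {"command": "python infer.py"}),                         # run inference
    ("runpod_stop_pod", {}),                                                 # stop compute billing
]
for tool, args in lifecycle:
    print(tool, args)
```

Stopping (rather than terminating) the pod at the end keeps its disk so a later `runpod_start_pod` can resume the same environment.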
Ready to try RunPod with Daslab?
Get started with the CLI