
View Online Inference Service

This guide describes how to view online inference services, including their basic information, service details, and logs.


Prerequisites

  • You have obtained an account and password for the management console.
  • You have created an online inference service.

Procedure

  1. Log in to the management console.
  2. In the top navigation bar, click Products and Services > AI Computing Platform > AI Computing Platform to go to its overview page.
  3. In the left navigation bar, select Inference Service > Online Inference Service to enter the Online Inference Service List page.
  4. On the Online Inference Service List page, view the basic information of all online inference services on the current platform. The following fields are displayed:
  • Service Name/ID: The service name is user-defined when the online inference service is created. The service ID is generated automatically by the system; click it to open the inference service's details page.
  • Status: The current status of the inference service, including Waiting, Creating, Running, Closed, and Failed.
  • Resource Configuration: The resource specification selected when the inference service was created.
  • Model: The name of the model deployed when the inference service was created.
  • Instances: The total number of Pod instances and the number of healthy ones. The total is the instance count set in the resource configuration when the inference service was created.
  • Access Address: The access address of a successfully deployed model; it supports both intranet and extranet access (see the request example after this list).
  • Creation Time: The time when the inference service was created.
  • Update Time: The time when the inference service was last updated.
  • Operation: The operations available differ by service status and mainly include Service Details, Close, Open, and Delete.
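
This guide does not document the request format for the access address. The following is a minimal sketch of calling a deployed model over HTTP; the address, the /v1/predict path, the payload schema, and the absence of authentication are all assumptions, not documented values. Replace them with whatever your service actually exposes.

```python
# Minimal sketch: sending a request to a deployed model's access address.
# The URL, path, payload, and headers below are hypothetical placeholders.
import requests

ACCESS_ADDRESS = "http://10.0.0.12:8080"   # hypothetical intranet access address
ENDPOINT = f"{ACCESS_ADDRESS}/v1/predict"  # hypothetical inference path

payload = {"inputs": ["example input text"]}    # hypothetical request body
headers = {"Content-Type": "application/json"}  # add auth headers if your service requires them

resp = requests.post(ENDPOINT, json=payload, headers=headers, timeout=30)
resp.raise_for_status()
print(resp.json())
```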
  5. Click the service name/ID of an inference service (or click Service Details in the Operation column) to enter its details page.
  6. On the Service Information tab, view the basic information, instance information, and billing information of the current inference service.
  7. On the Service Log tab, view the logs of all instances of the current inference service. You can filter the logs by start and end time or by entering keywords in the search box (a local-filtering sketch follows this procedure).
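
The time-range and keyword filters in step 7 are applied in the console itself. If you have copied or exported the log text, the same kind of filtering can be reproduced locally. The sketch below is illustrative only and assumes each line begins with an ISO-8601 timestamp, which may differ from the platform's actual log format.

```python
# Illustrative local equivalent of the console's time-range and keyword filters.
# Assumes each log line starts with an ISO-8601 timestamp, e.g. "2024-05-01T12:00:01 ...";
# adjust the parsing to your real log format.
from datetime import datetime

def filter_logs(lines, start, end, keyword=None):
    """Yield lines whose timestamp falls within [start, end] and that contain keyword."""
    for line in lines:
        try:
            ts = datetime.fromisoformat(line.split()[0])
        except (ValueError, IndexError):
            continue  # skip lines without a parseable leading timestamp
        if start <= ts <= end and (keyword is None or keyword in line):
            yield line

if __name__ == "__main__":
    sample = [
        "2024-05-01T12:00:01 INFO model loaded",
        "2024-05-01T12:05:42 ERROR request timed out",
    ]
    start = datetime(2024, 5, 1, 12, 0, 0)
    end = datetime(2024, 5, 1, 13, 0, 0)
    for line in filter_logs(sample, start, end, keyword="ERROR"):
        print(line)
```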
