Commit

Update tools Databricks notebooks notes about s3 event log support (#391)

* add notes about s3 log support

Signed-off-by: cindyyuanjiang <[email protected]>

* add s3 event log support in DB notebook note

Signed-off-by: cindyyuanjiang <[email protected]>

* addressed review feedback

Signed-off-by: cindyyuanjiang <[email protected]>

---------

Signed-off-by: cindyyuanjiang <[email protected]>
cindyyuanjiang authored Jun 13, 2024
1 parent 4e4a923 commit b0686ed
Showing 2 changed files with 50 additions and 48 deletions.
@@ -20,8 +20,9 @@
"To run the profiling tool, enter the log path that represents the DBFS location of your Spark GPU event logs. Then, select \"Run all\" to execute the notebook. Once the notebook completes, various output tables will appear below. For more options on running the profiling tool, please refer to the [Profiling Tool User Guide](https://docs.nvidia.com/spark-rapids/user-guide/latest/profiling/quickstart.html#running-the-tool).\n",
"\n",
"### Note\n",
"- Currently, only local or DBFS event log paths are supported.\n",
"- The DBFS path must use the File API format. Example: `/dbfs/<path-to-event-log>`.\n",
"- Currently, local, S3 or DBFS event log paths are supported.\n",
"- S3 path is only supported on Databricks AWS using [instance profiles](https://docs.databricks.com/en/connect/storage/tutorial-s3-instance-profile.html).\n",
"- DBFS path must use the File API format. Example: `/dbfs/<path-to-event-log>`.\n",
"- Multiple event logs must be comma-separated.\n",
"\n",
"### Per-Job Profile\n",
@@ -31,7 +32,7 @@
},
{
"cell_type": "code",
"execution_count": 0,
"execution_count": null,
"metadata": {
"application/vnd.databricks.v1+cell": {
"cellMetadata": {
@@ -55,7 +56,7 @@
},
{
"cell_type": "code",
"execution_count": 0,
"execution_count": null,
"metadata": {
"application/vnd.databricks.v1+cell": {
"cellMetadata": {
@@ -75,7 +76,7 @@
},
{
"cell_type": "code",
"execution_count": 0,
"execution_count": null,
"metadata": {
"application/vnd.databricks.v1+cell": {
"cellMetadata": {
@@ -121,7 +122,7 @@
},
{
"cell_type": "code",
"execution_count": 0,
"execution_count": null,
"metadata": {
"application/vnd.databricks.v1+cell": {
"cellMetadata": {
@@ -163,7 +164,7 @@
},
{
"cell_type": "code",
"execution_count": 0,
"execution_count": null,
"metadata": {
"application/vnd.databricks.v1+cell": {
"cellMetadata": {
@@ -187,7 +188,7 @@
},
{
"cell_type": "code",
"execution_count": 0,
"execution_count": null,
"metadata": {
"application/vnd.databricks.v1+cell": {
"cellMetadata": {
@@ -211,7 +212,7 @@
},
{
"cell_type": "code",
"execution_count": 0,
"execution_count": null,
"metadata": {
"application/vnd.databricks.v1+cell": {
"cellMetadata": {
@@ -281,7 +282,7 @@
},
{
"cell_type": "code",
"execution_count": 0,
"execution_count": null,
"metadata": {
"application/vnd.databricks.v1+cell": {
"cellMetadata": {
@@ -393,7 +394,7 @@
},
{
"cell_type": "code",
"execution_count": 0,
"execution_count": null,
"metadata": {
"application/vnd.databricks.v1+cell": {
"cellMetadata": {
@@ -506,50 +507,50 @@
"nuid": "8dddcaf7-104e-4247-b811-ff7a133b28d4",
"typedWidgetInfo": null,
"widgetInfo": {
"widgetType": "dropdown",
"defaultValue": "aws",
"label": null,
"name": "Cloud Provider",
"options": {
"widgetType": "dropdown",
"autoCreated": null,
"choices": [
"aws",
"azure"
]
}
],
"widgetType": "dropdown"
},
"widgetType": "dropdown"
}
},
"Eventlog Path": {
"currentValue": "/dbfs/user1/profiling_logs",
"nuid": "1272501d-5ad9-42be-ab62-35768b2fc384",
"typedWidgetInfo": null,
"widgetInfo": {
"widgetType": "text",
"defaultValue": "/dbfs/user1/profiling_logs",
"label": "",
"name": "Eventlog Path",
"options": {
"widgetType": "text",
"autoCreated": false,
"validationRegex": null
}
"validationRegex": null,
"widgetType": "text"
},
"widgetType": "text"
}
},
"Output Path": {
"currentValue": "/tmp",
"nuid": "ab7e082c-1ef9-4912-8fd7-51bf985eb9c1",
"typedWidgetInfo": null,
"widgetInfo": {
"widgetType": "text",
"defaultValue": "/tmp",
"label": null,
"name": "Output Path",
"options": {
"widgetType": "text",
"autoCreated": null,
"validationRegex": null
}
"validationRegex": null,
"widgetType": "text"
},
"widgetType": "text"
}
}
}
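The notes changed above describe which event log paths the notebooks accept: local paths, S3 paths on Databricks AWS with an instance profile, and DBFS paths in File API format, with multiple logs comma-separated. As a minimal illustrative sketch (not code from this commit), a Databricks notebook cell could split and classify such a comma-separated "Eventlog Path" widget value like this; the widget name comes from the diff, while the S3 bucket name is made up for the example:

```python
# Illustrative sketch, not part of this commit: split the comma-separated
# "Eventlog Path" widget value and classify each path per the notes above.
# Assumes a Databricks notebook where `dbutils` is available; the S3 bucket
# name below is hypothetical.
eventlog_path = dbutils.widgets.get("Eventlog Path")  # e.g. "/dbfs/user1/profiling_logs,s3://example-bucket/eventlogs"

event_logs = [p.strip() for p in eventlog_path.split(",") if p.strip()]

for log in event_logs:
    if log.startswith("s3://"):
        # S3 paths are supported only on Databricks AWS clusters configured
        # with an instance profile.
        print(f"S3 event log: {log}")
    elif log.startswith("/dbfs/"):
        # DBFS paths must use the File API format: /dbfs/<path-to-event-log>.
        print(f"DBFS event log: {log}")
    else:
        print(f"Local event log: {log}")
```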
@@ -20,14 +20,15 @@
"To run the qualification tool, enter the log path that represents the DBFS location of your Spark GPU event logs. Then, select \"Run all\" to execute the notebook. Once the notebook completes, various output tables will appear below. For more options on running the profiling tool, please refer to the [Qualification Tool User Guide](https://docs.nvidia.com/spark-rapids/user-guide/latest/qualification/quickstart.html#running-the-tool).\n",
"\n",
"### Note\n",
"- Currently, only local or DBFS event log paths are supported.\n",
"- The DBFS path must use the File API format. Example: `/dbfs/<path-to-event-log>`.\n",
"- Currently, local, S3 or DBFS event log paths are supported.\n",
"- S3 path is only supported on Databricks AWS using [instance profiles](https://docs.databricks.com/en/connect/storage/tutorial-s3-instance-profile.html).\n",
"- DBFS path must use the File API format. Example: `/dbfs/<path-to-event-log>`.\n",
"- Multiple event logs must be comma-separated.\n"
]
},
{
"cell_type": "code",
"execution_count": 0,
"execution_count": null,
"metadata": {
"application/vnd.databricks.v1+cell": {
"cellMetadata": {
@@ -51,7 +52,7 @@
},
{
"cell_type": "code",
"execution_count": 0,
"execution_count": null,
"metadata": {
"application/vnd.databricks.v1+cell": {
"cellMetadata": {
@@ -71,7 +72,7 @@
},
{
"cell_type": "code",
"execution_count": 0,
"execution_count": null,
"metadata": {
"application/vnd.databricks.v1+cell": {
"cellMetadata": {
@@ -117,7 +118,7 @@
},
{
"cell_type": "code",
"execution_count": 0,
"execution_count": null,
"metadata": {
"application/vnd.databricks.v1+cell": {
"cellMetadata": {
@@ -159,7 +160,7 @@
},
{
"cell_type": "code",
"execution_count": 0,
"execution_count": null,
"metadata": {
"application/vnd.databricks.v1+cell": {
"cellMetadata": {
@@ -183,7 +184,7 @@
},
{
"cell_type": "code",
"execution_count": 0,
"execution_count": null,
"metadata": {
"application/vnd.databricks.v1+cell": {
"cellMetadata": {
@@ -207,7 +208,7 @@
},
{
"cell_type": "code",
"execution_count": 0,
"execution_count": null,
"metadata": {
"application/vnd.databricks.v1+cell": {
"cellMetadata": {
@@ -277,7 +278,7 @@
},
{
"cell_type": "code",
"execution_count": 0,
"execution_count": null,
"metadata": {
"application/vnd.databricks.v1+cell": {
"cellMetadata": {
@@ -391,7 +392,7 @@
},
{
"cell_type": "code",
"execution_count": 0,
"execution_count": null,
"metadata": {
"application/vnd.databricks.v1+cell": {
"cellMetadata": {
@@ -442,7 +443,7 @@
},
{
"cell_type": "code",
"execution_count": 0,
"execution_count": null,
"metadata": {
"application/vnd.databricks.v1+cell": {
"cellMetadata": {
@@ -500,7 +501,7 @@
},
{
"cell_type": "code",
"execution_count": 0,
"execution_count": null,
"metadata": {
"application/vnd.databricks.v1+cell": {
"cellMetadata": {
@@ -590,50 +591,50 @@
"nuid": "8dddcaf7-104e-4247-b811-ff7a133b28d4",
"typedWidgetInfo": null,
"widgetInfo": {
"widgetType": "dropdown",
"defaultValue": "aws",
"label": null,
"name": "Cloud Provider",
"options": {
"widgetType": "dropdown",
"autoCreated": null,
"choices": [
"aws",
"azure"
]
}
],
"widgetType": "dropdown"
},
"widgetType": "dropdown"
}
},
"Eventlog Path": {
"currentValue": "/dbfs/user1/qualification_logs",
"nuid": "1272501d-5ad9-42be-ab62-35768b2fc384",
"typedWidgetInfo": null,
"widgetInfo": {
"widgetType": "text",
"defaultValue": "/dbfs/user1/qualification_logs",
"label": null,
"name": "Eventlog Path",
"options": {
"widgetType": "text",
"autoCreated": null,
"validationRegex": null
}
"validationRegex": null,
"widgetType": "text"
},
"widgetType": "text"
}
},
"Output Path": {
"currentValue": "/tmp",
"nuid": "ab7e082c-1ef9-4912-8fd7-51bf985eb9c1",
"typedWidgetInfo": null,
"widgetInfo": {
"widgetType": "text",
"defaultValue": "/tmp",
"label": null,
"name": "Output Path",
"options": {
"widgetType": "text",
"autoCreated": null,
"validationRegex": null
}
"validationRegex": null,
"widgetType": "text"
},
"widgetType": "text"
}
}
}
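The widget metadata reordered in both notebook diffs corresponds to three notebook widgets: a "Cloud Provider" dropdown defaulting to "aws" with choices "aws" and "azure", plus "Eventlog Path" and "Output Path" text widgets. A hedged sketch of how such widgets could be created in a Databricks notebook, based only on the names, defaults, and choices visible in the diff (not code taken from this commit):

```python
# Illustrative sketch based on the widget metadata shown in the diff above;
# not part of this commit. Creates the three widgets in a Databricks notebook.
dbutils.widgets.dropdown("Cloud Provider", "aws", ["aws", "azure"])
dbutils.widgets.text("Eventlog Path", "/dbfs/user1/qualification_logs")
dbutils.widgets.text("Output Path", "/tmp")
```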
