Commit b5c3918

update data
abdelaziz-mahdy committed May 28, 2024
1 parent 954d3c7 commit b5c3918
Showing 5 changed files with 106 additions and 99 deletions.
40 changes: 20 additions & 20 deletions README.md
@@ -55,69 +55,69 @@ Visual comparisons for database endpoints and static endpoints are provided to s
 # Database Endpoints

 ## Comparison Graph with db endpoints
-![Comparison Graph](comparison_graph_db_test.png?v=1716888105)
+![Comparison Graph](comparison_graph_db_test.png?v=1716893694)

 ## Detailed Graphs for each backend
 - **go mux db_test**
-![go mux db_test Benchmark Graph](backends/go/mux/tests/results/db_test/graph.png?v=1716888105)
+![go mux db_test Benchmark Graph](backends/go/mux/tests/results/db_test/graph.png?v=1716893694)

 - **c_sharp dot net db_test**
-![c_sharp dot net db_test Benchmark Graph](backends/c_sharp/dot-net/tests/results/db_test/graph.png?v=1716888105)
+![c_sharp dot net db_test Benchmark Graph](backends/c_sharp/dot-net/tests/results/db_test/graph.png?v=1716893694)

 - **python django sync db_test**
-![python django sync db_test Benchmark Graph](backends/python/django-sync/tests/results/db_test/graph.png?v=1716888105)
+![python django sync db_test Benchmark Graph](backends/python/django-sync/tests/results/db_test/graph.png?v=1716893694)

 - **python fast api db_test**
-![python fast api db_test Benchmark Graph](backends/python/fast-api/tests/results/db_test/graph.png?v=1716888105)
+![python fast api db_test Benchmark Graph](backends/python/fast-api/tests/results/db_test/graph.png?v=1716893694)

 - **python django async db_test**
-![python django async db_test Benchmark Graph](backends/python/django-async/tests/results/db_test/graph.png?v=1716888105)
+![python django async db_test Benchmark Graph](backends/python/django-async/tests/results/db_test/graph.png?v=1716893694)

 - **dart server pod db_test**
-![dart server pod db_test Benchmark Graph](backends/dart/server-pod/tests/results/db_test/graph.png?v=1716888105)
+![dart server pod db_test Benchmark Graph](backends/dart/server-pod/tests/results/db_test/graph.png?v=1716893694)

 - **rust actix web db_test**
-![rust actix web db_test Benchmark Graph](backends/rust/actix-web/tests/results/db_test/graph.png?v=1716888105)
+![rust actix web db_test Benchmark Graph](backends/rust/actix-web/tests/results/db_test/graph.png?v=1716893694)

 - **javascript express bun db_test**
-![javascript express bun db_test Benchmark Graph](backends/javascript/express-bun/tests/results/db_test/graph.png?v=1716888105)
+![javascript express bun db_test Benchmark Graph](backends/javascript/express-bun/tests/results/db_test/graph.png?v=1716893694)

 - **javascript express node db_test**
-![javascript express node db_test Benchmark Graph](backends/javascript/express-node/tests/results/db_test/graph.png?v=1716888105)
+![javascript express node db_test Benchmark Graph](backends/javascript/express-node/tests/results/db_test/graph.png?v=1716893694)



 # Static Endpoints

 ## Comparison Graph with static endpoints
-![Comparison Graph](comparison_graph_no_db_test.png?v=1716888105)
+![Comparison Graph](comparison_graph_no_db_test.png?v=1716893694)

 ## Detailed Graphs for each backend
 - **go mux no_db_test**
-![go mux no_db_test Benchmark Graph](backends/go/mux/tests/results/no_db_test/graph.png?v=1716888105)
+![go mux no_db_test Benchmark Graph](backends/go/mux/tests/results/no_db_test/graph.png?v=1716893694)

 - **c_sharp dot net no_db_test**
-![c_sharp dot net no_db_test Benchmark Graph](backends/c_sharp/dot-net/tests/results/no_db_test/graph.png?v=1716888105)
+![c_sharp dot net no_db_test Benchmark Graph](backends/c_sharp/dot-net/tests/results/no_db_test/graph.png?v=1716893694)

 - **python django sync no_db_test**
-![python django sync no_db_test Benchmark Graph](backends/python/django-sync/tests/results/no_db_test/graph.png?v=1716888105)
+![python django sync no_db_test Benchmark Graph](backends/python/django-sync/tests/results/no_db_test/graph.png?v=1716893694)

 - **python fast api no_db_test**
-![python fast api no_db_test Benchmark Graph](backends/python/fast-api/tests/results/no_db_test/graph.png?v=1716888105)
+![python fast api no_db_test Benchmark Graph](backends/python/fast-api/tests/results/no_db_test/graph.png?v=1716893694)

 - **python django async no_db_test**
-![python django async no_db_test Benchmark Graph](backends/python/django-async/tests/results/no_db_test/graph.png?v=1716888105)
+![python django async no_db_test Benchmark Graph](backends/python/django-async/tests/results/no_db_test/graph.png?v=1716893694)

 - **dart server pod no_db_test**
-![dart server pod no_db_test Benchmark Graph](backends/dart/server-pod/tests/results/no_db_test/graph.png?v=1716888105)
+![dart server pod no_db_test Benchmark Graph](backends/dart/server-pod/tests/results/no_db_test/graph.png?v=1716893694)

 - **rust actix web no_db_test**
-![rust actix web no_db_test Benchmark Graph](backends/rust/actix-web/tests/results/no_db_test/graph.png?v=1716888105)
+![rust actix web no_db_test Benchmark Graph](backends/rust/actix-web/tests/results/no_db_test/graph.png?v=1716893694)

 - **javascript express bun no_db_test**
-![javascript express bun no_db_test Benchmark Graph](backends/javascript/express-bun/tests/results/no_db_test/graph.png?v=1716888105)
+![javascript express bun no_db_test Benchmark Graph](backends/javascript/express-bun/tests/results/no_db_test/graph.png?v=1716893694)

 - **javascript express node no_db_test**
-![javascript express node no_db_test Benchmark Graph](backends/javascript/express-node/tests/results/no_db_test/graph.png?v=1716888105)
+![javascript express node no_db_test Benchmark Graph](backends/javascript/express-node/tests/results/no_db_test/graph.png?v=1716893694)
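The only change in every README line above is the `?v=` query string, a cache-busting parameter that forces GitHub's image proxy to refetch the graphs after each data update. As a minimal sketch (the function name and the use of the current Unix time are assumptions, not the repository's actual script), the rewrite can be done with one regex pass over the markdown:

```python
import re
import time


def bust_image_cache(markdown, version=None):
    """Replace the ?v= cache-busting parameter on .png links with a new version.

    If no version is given, the current Unix timestamp is used, so every
    regeneration produces a fresh URL and stale cached images are skipped.
    """
    v = version if version is not None else int(time.time())
    return re.sub(r"\.png\?v=\d+", f".png?v={v}", markdown)


line = "![Comparison Graph](comparison_graph_db_test.png?v=1716888105)"
print(bust_image_cache(line, 1716893694))
# -> ![Comparison Graph](comparison_graph_db_test.png?v=1716893694)
```

Because the image paths themselves never change, browsers would otherwise keep serving the old graphs; bumping the query string is a cheap way to invalidate that cache without renaming files.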


66 changes: 0 additions & 66 deletions benchmark-app/processData.js

This file was deleted.

2 changes: 1 addition & 1 deletion benchmark-app/public/data.json

Large diffs are not rendered by default.

2 changes: 1 addition & 1 deletion results_data.json

Large diffs are not rendered by default.

95 changes: 84 additions & 11 deletions scripts/graphs/graph_generator.py
@@ -13,7 +13,7 @@ def process_file(file_path):
     # Load the data from the provided file
     data = pd.read_csv(file_path, on_bad_lines='skip')

-    data['timestamp'] = pd.to_numeric(data['Timestamp'], errors='coerce')
+    data['timestamp'] = pd.to_numeric(data['Timestamp'], errors='coerce').astype('int').apply(int)
     # Convert Timestamp to datetime and then to seconds relative to the start
     data['Timestamp'] = pd.to_datetime(data['Timestamp'], unit='s')
     data['Timestamp'] = (data['Timestamp'] - data['Timestamp'].min()).dt.total_seconds()
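The pattern in this hunk — coerce to numeric, convert epoch seconds to datetime, then rebase to seconds since the run started — can be seen in isolation. This is a sketch with hypothetical stand-in data, not the repository's CSV:

```python
import pandas as pd

# Hypothetical rows standing in for benchmark_stats_history.csv.
data = pd.DataFrame({"Timestamp": ["1716888105", "1716888110", "1716888120"]})

# Coerce to numeric first so malformed rows become NaN instead of raising.
data["timestamp"] = pd.to_numeric(data["Timestamp"], errors="coerce").astype("int64")

# Interpret the values as Unix epoch seconds, then rebase each row to
# seconds elapsed since the earliest sample so runs can be overlaid.
data["Timestamp"] = pd.to_datetime(data["timestamp"], unit="s")
data["Timestamp"] = (data["Timestamp"] - data["Timestamp"].min()).dt.total_seconds()

print(data["Timestamp"].tolist())  # -> [0.0, 5.0, 15.0]
```

Note that in the committed change, `.astype('int').apply(int)` is applied on top of `to_numeric`; the `.apply(int)` is a per-element repeat of what `.astype('int')` already did, so a single cast (as sketched here) is sufficient.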
@@ -43,7 +43,7 @@ def process_file_cpu_usage(file_path,summary):
     data = pd.read_csv(file_path.replace("benchmark_stats_history.csv","cpu_usage.csv"), on_bad_lines='skip')

     # Convert 'timestamp' to numeric type before converting to datetime
-    data['timestamp'] = pd.to_numeric(data['timestamp'], errors='coerce')
+    data['timestamp'] = pd.to_numeric(data['timestamp'], errors='coerce').astype('int').apply(int)
     data['Timestamp'] = pd.to_datetime(data['timestamp'], unit='s')
     data['Timestamp'] = (data['Timestamp'] - data['Timestamp'].min()).dt.total_seconds()
@@ -286,8 +286,19 @@ def get_adjusted_file_name(file_path):
 # compare_and_plot(all_data, all_summaries, all_cpu)
 for parent_dir in all_data:
     compare_and_plot(all_data[parent_dir], all_summaries[parent_dir], all_cpu[parent_dir],custom_result_file_name="comparison_graph_"+parent_dir)
-def merge_data_and_cpu(data, cpu):
-    # Ensure data and cpu are sorted by 'Timestamp'
+# Assuming 'df' is your DataFrame
+def print_all_columns_and_first_five_rows(df):
+    # Set display option to show all columns
+    pd.set_option('display.max_columns', None)
+
+    # Print the first 5 rows
+    print(df.head())
+def merge_data_and_cpu(data, cpu, print_data=False):
+    if print_data:
+        print_all_columns_and_first_five_rows(data)
+        print_all_columns_and_first_five_rows(cpu)
+
+    # Ensure data and cpu are sorted by 'timestamp'
     data = data.sort_values('timestamp').reset_index(drop=True)
     cpu = cpu.sort_values('timestamp').reset_index(drop=True)

@@ -301,19 +301,79 @@ def merge_data_and_cpu(data, cpu):
     cpu_length = len(cpu)

     for index, row in data.iterrows():
-        # Move the CPU index forward if the CPU timestamp is less than the current data timestamp
-        while cpu_index < cpu_length and cpu.loc[cpu_index, 'timestamp'] <= row['timestamp']:
+        # Move the CPU index forward if the CPU timestamp is less than or equal to the current data timestamp
+        while cpu_index < cpu_length and float(cpu.loc[cpu_index, 'timestamp']) <= float(row['timestamp']):
             cpu_index += 1
+            if print_data and cpu_index < 5 and cpu_index < cpu_length:
+                print({
+                    "cpu": cpu.loc[cpu_index],
+                    "cpu_index": cpu_index,
+                    "cpu_timestamp": cpu.loc[cpu_index, 'timestamp'],
+                    "data_timestamp": row['timestamp'],
+                    "cpu_length": cpu_length
+                })

-        # If the CPU index is within the bounds and greater than the data timestamp, merge the CPU data
-        if cpu_index < cpu_length:
+        if cpu_index >= cpu_length:
+            break
+
+        if print_data and cpu_index < 5 and cpu_index < cpu_length:
+            print("DONE", {
+                "cpu": cpu.loc[cpu_index],
+                "cpu_index": cpu_index,
+                "cpu_timestamp": cpu.loc[cpu_index, 'timestamp'],
+                "data_timestamp": row['timestamp'],
+                "cpu_length": cpu_length
+            })
+
+        # If the CPU index is within the bounds and the CPU timestamp is greater than the data timestamp, merge the CPU data
+        if cpu_index < cpu_length and cpu.loc[cpu_index, 'timestamp'] > row['timestamp']:
+            if print_data and cpu_index < 5:
+                print("merging")
             data.at[index, 'benchmark_cpu_usage'] = cpu.loc[cpu_index, 'benchmark_cpu_usage']
             data.at[index, 'benchmark_mem_usage'] = str(cpu.loc[cpu_index, 'benchmark_mem_usage'])
             data.at[index, 'db_cpu_usage'] = cpu.loc[cpu_index, 'db_cpu_usage']
             data.at[index, 'db_mem_usage'] = str(cpu.loc[cpu_index, 'db_mem_usage'])

     return data

+def merge_data_and_cpu(data, cpu, print_data=False):
+    # Ensure data and cpu are sorted by 'timestamp'
+    data = data.sort_values('timestamp').reset_index(drop=True)
+    cpu = cpu.sort_values('timestamp').reset_index(drop=True)
+
+    # Initialize the columns in data for CPU usage and memory usage
+    data['benchmark_cpu_usage'] = None
+    data['benchmark_mem_usage'] = None
+    data['db_cpu_usage'] = None
+    data['db_mem_usage'] = None
+
+    cpu_index = 0
+    cpu_length = len(cpu)
+
+    for index, row in data.iterrows():
+        # Move the CPU index forward if the CPU timestamp is less than the current data timestamp
+        while cpu_index < cpu_length and cpu.loc[cpu_index, 'timestamp'] < row['timestamp']:
+            cpu_index += 1
+
+        if cpu_index >= cpu_length:
+            break
+
+        # Check if the previous CPU timestamp is less than the current data timestamp
+        if cpu_index > 0 and cpu.loc[cpu_index - 1, 'timestamp'] < row['timestamp']:
+            # Use the previous CPU index for merging as its timestamp is less than the data timestamp
+            prev_cpu_index = cpu_index - 1
+            data.at[index, 'benchmark_cpu_usage'] = cpu.loc[prev_cpu_index, 'benchmark_cpu_usage']
+            data.at[index, 'benchmark_mem_usage'] = str(cpu.loc[prev_cpu_index, 'benchmark_mem_usage'])
+            data.at[index, 'db_cpu_usage'] = cpu.loc[prev_cpu_index, 'db_cpu_usage']
+            data.at[index, 'db_mem_usage'] = str(cpu.loc[prev_cpu_index, 'db_mem_usage'])
+        else:
+            # Remove the CPU and memory fields by setting them to None
+            data.at[index, 'benchmark_cpu_usage'] = None
+            data.at[index, 'benchmark_mem_usage'] = None
+            data.at[index, 'db_cpu_usage'] = None
+            data.at[index, 'db_mem_usage'] = None
+    return data
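The second `merge_data_and_cpu` definition (which shadows the first, since Python keeps only the last binding of the name) attaches to each benchmark row the most recent CPU sample taken strictly before it. pandas ships this exact "previous sample" join as `merge_asof`; a minimal sketch with hypothetical data, using `allow_exact_matches=False` to mirror the loop's strict `<` comparison:

```python
import pandas as pd

# Hypothetical samples: benchmark rows and CPU snapshots on offset clocks.
data = pd.DataFrame({"timestamp": [10, 20, 30], "rps": [100, 110, 105]})
cpu = pd.DataFrame({"timestamp": [9, 19, 29],
                    "benchmark_cpu_usage": [50.0, 55.0, 60.0]})

# direction="backward" takes the last cpu row before each data row;
# allow_exact_matches=False makes the comparison strictly less-than,
# matching the hand-rolled loop's prev_cpu_index rule.
merged = pd.merge_asof(
    data.sort_values("timestamp"),
    cpu.sort_values("timestamp"),
    on="timestamp",
    direction="backward",
    allow_exact_matches=False,
)
print(merged["benchmark_cpu_usage"].tolist())  # -> [50.0, 55.0, 60.0]
```

A vectorized `merge_asof` would also sidestep the per-row `iterrows` cost, which grows quickly on long benchmark histories.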



def data_json(all_summaries, all_data, all_cpu):
@@ -323,7 +394,7 @@ def custom_serializer(obj):
         raise TypeError(f'Object of type {obj.__class__.__name__} is not JSON serializable')

     combined_data = {}
-
+    print_data=True
     for parent_dir in all_data:
         for path, data in all_data[parent_dir].items():
             service_name = get_adjusted_file_name(path)
@@ -334,16 +405,18 @@ def custom_serializer(obj):
             if isinstance(all_cpu[parent_dir][path], pd.DataFrame):
                 all_cpu[parent_dir][path].fillna(0, inplace=True)

-            merged_data = merge_data_and_cpu(data, all_cpu[parent_dir][path])
+            merged_data = merge_data_and_cpu(data, all_cpu[parent_dir][path],print_data)

             combined_data[service_name] = {
                 'summary': all_summaries[parent_dir][path],
                 'data': merged_data
             }

+            print_data=False
+
     try:
         all_data_json = json.dumps(combined_data, default=custom_serializer)
-        with open('/mnt/data/results_data.json', 'w') as file:
+        with open('/mnt/data/benchmark-app/public/data.json', 'w') as file:
             file.write(all_data_json)

         print("JSON data successfully written to file.")
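The `json.dumps(..., default=custom_serializer)` call relies on the standard-library hook: `default=` is invoked only for objects the encoder cannot serialize on its own, and must either return a serializable value or raise `TypeError`. A minimal sketch of that pattern; the branches shown here are assumptions about what benchmark data typically contains, not the repository's exact `custom_serializer`:

```python
import json

import numpy as np
import pandas as pd


def custom_serializer(obj):
    # Called by json.dumps only for objects it cannot encode natively.
    if isinstance(obj, np.integer):
        return int(obj)
    if isinstance(obj, np.floating):
        return float(obj)
    if isinstance(obj, pd.DataFrame):
        # One dict per row keeps the JSON shape easy to consume in JS.
        return obj.to_dict(orient="records")
    raise TypeError(f'Object of type {obj.__class__.__name__} is not JSON serializable')


payload = {"count": np.int64(3), "avg": np.float64(1.5)}
print(json.dumps(payload, default=custom_serializer))
```

Raising `TypeError` for unhandled types (rather than returning `None`) surfaces new unserializable columns immediately instead of silently dropping them.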
