
Run django db migrations for tests #729

Draft · wants to merge 13 commits into base: main

Conversation

michelletran-codecov
Contributor

This ensures that the tables the tests run against are created from the Django models, rather than from the slightly less accurate SQLAlchemy models.

Some of the fixes:

  • enforced constraints defined in the DB
  • added default values for model fields that were not nullable
  • fixed factory side effects when specifying a default linked value (i.e. don't set foreign keys if we're not ready for that object to actually exist). This involved some hacks: set the value to null, flush to the DB, then set the integer foreign key directly (but don't flush, since flushing is when that linked object would get created); see the sketch after this list.
  • removed duplication (especially when creating objects with unique constraints) by creating fixtures for some repeatedly created objects
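
For illustration, a rough sketch of the foreign-key workaround and the fixture reuse described above (hypothetical test code; `OwnerFactory`, `RepositoryFactory`, and `CommitFactory` are the project's existing factories, and `dbsession` is the SQLAlchemy session fixture used in these tests):

```python
import pytest

# Assumed to come from the project's test factory module, e.g.:
# from database.tests.factories import OwnerFactory, RepositoryFactory, CommitFactory


@pytest.fixture
def repository(dbsession):
    # Reusing one fixture-created repository (and owner) across tests avoids
    # tripping unique constraints that duplicated ad-hoc objects used to hit.
    owner = OwnerFactory.create(service="github")
    repo = RepositoryFactory.create(owner=owner)
    dbsession.add(repo)
    dbsession.flush()
    return repo


def test_fk_workaround_sketch(dbsession, repository):
    commit = CommitFactory.create(repository=repository)
    # 1. Don't point the FK at an object that isn't ready to exist yet.
    commit.pullid = None
    dbsession.add(commit)
    # 2. Flush so the commit row exists in the DB with a NULL foreign key.
    dbsession.flush()
    # 3. Assign the raw integer id (hypothetical value here) without flushing;
    #    per the description above, flushing is when the linked object would
    #    actually get created.
    commit.pullid = 1234
```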

Legal Boilerplate

Look, I get it. The entity doing business as "Sentry" was incorporated in the State of Delaware in 2015 as Functional Software, Inc. In 2022 this entity acquired Codecov and as result Sentry is going to need some rights from me in order to utilize my contributions in this PR. So here's the deal: I retain all rights, title and interest in and to my contributions, and by keeping this boilerplate intact I confirm that Sentry can use, modify, copy, and redistribute my contributions, under Sentry's choice of terms.

Previously, Django and SQLAlchemy each bootstrapped their own test database. This change has Django bootstrap the test database with the (more canonical) Django models, and SQLAlchemy reuses the database that Django bootstraps for testing purposes.
Quite a few tables have not been moved to `shared` yet, so for the sake of getting tests to work, those are still bootstrapped from the SQLAlchemy models.
This brings the setup closer to how the models are defined in Django.
The tests were failing on duplicate keys; fixtures reuse existing objects, which avoids creating objects with duplicate keys.
This was failing on null values; using the factory fills in the fields that are required.
This helps reduce database key conflicts by reusing existing objects in tests.
This is because the actual DB value is timezone-aware.
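
For illustration, a minimal conftest-style sketch of how Django could bootstrap the test database and SQLAlchemy could reuse it (assuming pytest-django's `django_db_setup`/`django_db_blocker` fixtures and an illustrative `Base` declarative base for the SQLAlchemy models; the actual wiring in this PR may differ):

```python
# conftest.py -- illustrative only
import pytest
from django.conf import settings
from sqlalchemy import create_engine
from sqlalchemy.orm import sessionmaker

# Assumed declarative base for the worker's SQLAlchemy models:
# from database.base import Base


@pytest.fixture(scope="session")
def sqlalchemy_engine(django_db_setup, django_db_blocker):
    """Reuse the database that pytest-django migrated instead of letting
    SQLAlchemy bootstrap its own copy of the schema."""
    with django_db_blocker.unblock():
        db = settings.DATABASES["default"]
        url = (
            f"postgresql://{db['USER']}:{db['PASSWORD']}"
            f"@{db['HOST']}:{db['PORT']}/{db['NAME']}"
        )
        engine = create_engine(url)
        # Tables not yet moved to `shared` (i.e. not covered by Django
        # migrations) are still created from the SQLAlchemy models;
        # existing Django-created tables are left alone.
        Base.metadata.create_all(engine, checkfirst=True)
        yield engine
        engine.dispose()


@pytest.fixture
def dbsession(sqlalchemy_engine):
    Session = sessionmaker(bind=sqlalchemy_engine)
    session = Session()
    yield session
    session.rollback()
    session.close()
```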

codecov bot commented Sep 19, 2024

❌ 30 Tests Failed:

Tests completed: 1695 | Failed: 30 | Passed: 1665 | Skipped: 0
View the top 3 failed tests by shortest run time
tasks.tests.unit.test_process_flakes test_it_creates_flakes_fail_after_merge
Stack Traces | 0.007s run time
No failure message available
tasks.tests.unit.test_process_flakes test_it_creates_flakes_from_new_branch_only
Stack Traces | 0.007s run time
No failure message available
tasks.tests.unit.test_process_flakes test_it_handles_only_passes
Stack Traces | 0.008s run time
No failure message available

To view individual test run time comparison to the main branch, go to the Test Analytics Dashboard

@codecov-notifications

codecov-notifications bot commented Sep 19, 2024

❌ 30 Tests Failed:

Tests completed: 1695 | Failed: 30 | Passed: 1665 | Skipped: 0
View the top 3 failed tests by shortest run time
tasks.tests.unit.test_process_flakes test_it_does_not_detect_unmerged_tests
Stack Traces | 0.007s run time
No failure message available
tasks.tests.unit.test_process_flakes test_it_creates_flakes_expires
Stack Traces | 0.008s run time
No failure message available
tasks.tests.unit.test_process_flakes test_it_handles_only_passes
Stack Traces | 0.008s run time
No failure message available

To view individual test run time comparison to the main branch, go to the Test Analytics Dashboard

@codecov-qa

codecov-qa bot commented Sep 19, 2024

❌ 30 Tests Failed:

Tests completed: 1695 | Failed: 30 | Passed: 1665 | Skipped: 0
View the top 3 failed tests by shortest run time
tasks.tests.unit.test_process_flakes test_it_creates_flakes_fail_after_merge
Stack Traces | 0.007s run time
No failure message available
tasks.tests.unit.test_process_flakes test_it_creates_flakes_from_orig_branch
Stack Traces | 0.007s run time
No failure message available
tasks.tests.unit.test_process_flakes test_it_creates_flakes_expires
Stack Traces | 0.008s run time
No failure message available

To view individual test run time comparison to the main branch, go to the Test Analytics Dashboard


codecov-public-qa bot commented Sep 19, 2024

Test Failures Detected: Due to failing tests, we cannot provide coverage reports at this time.

❌ Failed Test Results:

Completed 1695 tests with 30 failed, 1665 passed and 0 skipped.

View the full list of failed tests

pytest

  • Class name: services.notification.notifiers.tests.unit.test_comment.TestCommentNotifierWelcome
    Test name: test_build_message

    self = <test_comment.TestCommentNotifierWelcome object at 0x7f5b8c843830>
    dbsession = <sqlalchemy.orm.session.Session object at 0x7f5b83847c50>
    mock_configuration = <shared.config.ConfigHelper object at 0x7f5b83d7f080>
    mock_repo_provider = <MagicMock name='_get_repo_provider_service_instance()' spec='Github' id='140031031219680'>
    sample_comparison = <services.comparison.ComparisonProxy object at 0x7f5b886715e0>

    @pytest.mark.asyncio
    async def test_build_message(
    self, dbsession, mock_configuration, mock_repo_provider, sample_comparison
    ):
    mock_configuration.params["setup"]["codecov_dashboard_url"] = "test.example.br"

    notifier = CommentNotifier(
    repository=sample_comparison.head.commit.repository,
    title="title",
    notifier_yaml_settings={"layout": "reach, diff, flags, files, footer"},
    notifier_site_settings=True,
    current_yaml={},
    )
    result = await notifier.build_message(sample_comparison)
    expected_result = [
    "## Welcome to [Codecov](https://codecov.io) :tada:",
    "",
    "Once you merge this PR into your default branch, you're all set! Codecov will compare coverage reports and display results in all future pull requests.",
    "",
    "Thanks for integrating Codecov - We've got you covered :open_umbrella:",
    ]
    for exp, res in zip(expected_result, result):
    > assert exp == res
    E AssertionError: assert 'Thanks for i...pen_umbrella:' == ':information...uest-comment)'
    E
    E - :information_source: You can also turn on [project coverage checks](https://docs.codecov.com/docs/common-recipe-list#set-project-coverage-checks-on-a-pull-request) and [project coverage reporting on Pull Request comment](https://docs.codecov.com/docs/common-recipe-list#show-project-coverage-changes-on-the-pull-request-comment)
    E + Thanks for integrating Codecov - We've got you covered :open_umbrella:

    .../tests/unit/test_comment.py:5149: AssertionError
  • Class name: services.notification.notifiers.tests.unit.test_comment.TestCommentNotifierWelcome
    Test name: test_build_message_with_preexisting_bundle_pulls

    self = <test_comment.TestCommentNotifierWelcome object at 0x7f5b8c8287a0>
    dbsession = <sqlalchemy.orm.session.Session object at 0x7f5b88671550>
    mock_configuration = <shared.config.ConfigHelper object at 0x7f5b8844b2c0>
    mock_repo_provider = <MagicMock name='_get_repo_provider_service_instance()' spec='Github' id='140031033048816'>

    @pytest.mark.asyncio
    async def test_build_message_with_preexisting_bundle_pulls(
    self, dbsession, mock_configuration, mock_repo_provider
    ):
    mock_configuration.params["setup"]["codecov_dashboard_url"] = "test.example.br"

    owner = OwnerFactory.create(
    service="github",
    )
    repository = RepositoryFactory.create(owner=owner)
    branch = "new_branch"
    # artificially create multiple pull entries with BA comments only
    ba_pull_one = PullFactory.create(
    repository=repository,
    base=CommitFactory.create(repository=repository).commitid,
    head=CommitFactory.create(repository=repository, branch=branch).commitid,
    commentid=None,
    bundle_analysis_commentid="98123978",
    )
    ba_pull_two = PullFactory.create(
    repository=repository,
    base=CommitFactory.create(repository=repository).commitid,
    head=CommitFactory.create(repository=repository, branch=branch).commitid,
    commentid=None,
    bundle_analysis_commentid="23982347",
    )
    # Add these entries first so they are created before the pull with commentid only
    dbsession.add_all([ba_pull_one, ba_pull_two])
    dbsession.flush()

    # Create new coverage pull
    base_commit = CommitFactory.create(repository=repository)
    head_commit = CommitFactory.create(repository=repository, branch=branch)
    pull = PullFactory.create(
    repository=repository,
    base=base_commit.commitid,
    head=head_commit.commitid,
    )

    head_report = Report()
    head_file = ReportFile("file_1.go")
    head_file.append(
    1, ReportLine.create(coverage=1, sessions=[[0, 1]], complexity=(10, 2))
    )
    head_report.append(head_file)

    base_report = Report()
    base_file = ReportFile("file_1.go")
    base_file.append(
    1, ReportLine.create(coverage=0, sessions=[[0, 1]], complexity=(10, 2))
    )
    base_report.append(base_file)

    head_full_commit = FullCommit(
    commit=head_commit, report=ReadOnlyReport.create_from_report(head_report)
    )
    base_full_commit = FullCommit(
    commit=base_commit, report=ReadOnlyReport.create_from_report(base_report)
    )
    comparison = ComparisonProxy(
    Comparison(
    head=head_full_commit,
    project_coverage_base=base_full_commit,
    patch_coverage_base_commitid=base_commit.commitid,
    enriched_pull=EnrichedPull(
    database_pull=pull,
    provider_pull={
    "author": {"id": "12345", "username": "codecov-test-user"},
    "base": {"branch": "master", "commitid": base_commit.commitid},
    "head": {
    "branch": "reason/some-testing",
    "commitid": head_commit.commitid,
    },
    "number": str(pull.pullid),
    "id": str(pull.pullid),
    "state": "open",
    "title": "Creating new code for reasons no one knows",
    },
    ),
    )
    )
    dbsession.add_all([repository, base_commit, head_commit, pull])
    dbsession.flush()

    notifier = CommentNotifier(
    repository=comparison.head.commit.repository,
    title="title",
    notifier_yaml_settings={"layout": "reach, diff, flags, files, footer"},
    notifier_site_settings=True,
    current_yaml={},
    )
    result = await notifier.build_message(comparison)

    expected_result = [
    "## Welcome to [Codecov](https://codecov.io) :tada:",
    "",
    "Once you merge this PR into your default branch, you're all set! Codecov will compare coverage reports and display results in all future pull requests.",
    "",
    "Thanks for integrating Codecov - We've got you covered :open_umbrella:",
    ]
    for exp, res in zip(expected_result, result):
    > assert exp == res
    E AssertionError: assert 'Thanks for i...pen_umbrella:' == ':information...uest-comment)'
    E
    E - :information_source: You can also turn on [project coverage checks](https://docs.codecov.com/docs/common-recipe-list#set-project-coverage-checks-on-a-pull-request) and [project coverage reporting on Pull Request comment](https://docs.codecov.com/docs/common-recipe-list#show-project-coverage-changes-on-the-pull-request-comment)
    E + Thanks for integrating Codecov - We've got you covered :open_umbrella:

    .../tests/unit/test_comment.py:5253: AssertionError
  • Class name: services.notification.tests.unit.test_commit_notifications.TestCommitNotificationsServiceTestCase
    Test name: test_create_or_update_commit_notification_not_yet_exists_no_pull_but_ghapp_info

    self = <sqlalchemy.engine.base.Connection object at 0x7f5b83b02600>
    dialect = <sqlalchemy.dialects.postgresql.psycopg2.PGDialect_psycopg2 object at 0x7f5b8a64deb0>
    constructor = <bound method DefaultExecutionContext._init_compiled of <class 'sqlalchemy.dialects.postgresql.psycopg2.PGExecutionContext_psycopg2'>>
    statement = 'INSERT INTO codecov_auth_githubappinstallation (external_id, created_at, updated_at, name, owner_id, is_suspended) VA...reated_at)s, %(updated_at)s, %(name)s, %(owner_id)s, %(is_suspended)s) RETURNING codecov_auth_githubappinstallation.id'
    parameters = {'created_at': datetime.datetime(2024, 9, 20, 3, 10, 19, 999661, tzinfo=datetime.timezone.utc), 'external_id': UUID('8316f152-c288-426a-84ff-52d287ff1328'), 'is_suspended': False, 'name': 'codecov_app_installation', ...}
    args = (<sqlalchemy.dialects.postgresql.psycopg2.PGCompiler_psycopg2 object at 0x7f5b83b01e50>, [{'owner_id': 1392}])
    conn = <sqlalchemy.pool.base._ConnectionFairy object at 0x7f5b6c502540>
    context = <sqlalchemy.dialects.postgresql.psycopg2.PGExecutionContext_psycopg2 object at 0x7f5b6c5027b0>

    def _execute_context(
    self, dialect, constructor, statement, parameters, *args
    ):
    """Create an :class:`.ExecutionContext` and execute, returning
    a :class:`_engine.ResultProxy`.

    """

    try:
    try:
    conn = self.__connection
    except AttributeError:
    # escape "except AttributeError" before revalidating
    # to prevent misleading stacktraces in Py3K
    conn = None
    if conn is None:
    conn = self._revalidate_connection()

    context = constructor(dialect, self, conn, *args)
    except BaseException as e:
    self._handle_dbapi_exception(
    e, util.text_type(statement), parameters, None, None
    )

    if context.compiled:
    context.pre_exec()

    cursor, statement, parameters = (
    context.cursor,
    context.statement,
    context.parameters,
    )

    if not context.executemany:
    parameters = parameters[0]

    if self._has_events or self.engine._has_events:
    for fn in self.dispatch.before_cursor_execute:
    statement, parameters = fn(
    self,
    cursor,
    statement,
    parameters,
    context,
    context.executemany,
    )

    if self._echo:
    self.engine.logger.info(statement)
    if not self.engine.hide_parameters:
    self.engine.logger.info(
    "%r",
    sql_util._repr_params(
    parameters, batches=10, ismulti=context.executemany
    ),
    )
    else:
    self.engine.logger.info(
    "[SQL parameters hidden due to hide_parameters=True]"
    )

    evt_handled = False
    try:
    if context.executemany:
    if self.dialect._has_events:
    for fn in self.dialect.dispatch.do_executemany:
    if fn(cursor, statement, parameters, context):
    evt_handled = True
    break
    if not evt_handled:
    self.dialect.do_executemany(
    cursor, statement, parameters, context
    )
    elif not parameters and context.no_parameters:
    if self.dialect._has_events:
    for fn in self.dialect.dispatch.do_execute_no_params:
    if fn(cursor, statement, context):
    evt_handled = True
    break
    if not evt_handled:
    self.dialect.do_execute_no_params(
    cursor, statement, context
    )
    else:
    if self.dialect._has_events:
    for fn in self.dialect.dispatch.do_execute:
    if fn(cursor, statement, parameters, context):
    evt_handled = True
    break
    if not evt_handled:
    > self.dialect.do_execute(
    cursor, statement, parameters, context
    )

    .../local/lib/python3.12.../sqlalchemy/engine/base.py:1276:
    _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

    self = <sqlalchemy.dialects.postgresql.psycopg2.PGDialect_psycopg2 object at 0x7f5b8a64deb0>
    cursor = <cursor object at 0x7f5b83983880; closed: -1>
    statement = 'INSERT INTO codecov_auth_githubappinstallation (external_id, created_at, updated_at, name, owner_id, is_suspended) VA...reated_at)s, %(updated_at)s, %(name)s, %(owner_id)s, %(is_suspended)s) RETURNING codecov_auth_githubappinstallation.id'
    parameters = {'created_at': datetime.datetime(2024, 9, 20, 3, 10, 19, 999661, tzinfo=datetime.timezone.utc), 'external_id': UUID('8316f152-c288-426a-84ff-52d287ff1328'), 'is_suspended': False, 'name': 'codecov_app_installation', ...}
    context = <sqlalchemy.dialects.postgresql.psycopg2.PGExecutionContext_psycopg2 object at 0x7f5b6c5027b0>

    def do_execute(self, cursor, statement, parameters, context=None):
    > cursor.execute(statement, parameters)
    E psycopg2.errors.NotNullViolation: null value in column "installation_id" of relation "codecov_auth_githubappinstallation" violates not-null constraint
    E DETAIL: Failing row contains (3, 8316f152-c288-426a-84ff-52d287ff1328, 2024-09-20 03:10:19.999661+00, 2024-09-20 03:10:19.999665+00, null, codecov_app_installation, null, 1392, null, null, f).

    .../local/lib/python3.12.../sqlalchemy/engine/default.py:608: NotNullViolation

    The above exception was the direct cause of the following exception:

    self = <test_commit_notifications.TestCommitNotificationsServiceTestCase object at 0x7f5b8d4ad7f0>
    dbsession = <sqlalchemy.orm.session.Session object at 0x7f5b6c5010a0>
    comparison = Comparison(head=FullCommit(commit=Commit<12238f7ebb90735404e5e02376155f320f11ec39@repo<392>>, report=None), project_co...=None), patch_coverage_base_commitid='323123270c5f3fa23cb072c00959d4c546b9f47a', enriched_pull=None, current_yaml=None)

    def test_create_or_update_commit_notification_not_yet_exists_no_pull_but_ghapp_info(
    self, dbsession, comparison
    ):
    comparison.enriched_pull = None
    commit = comparison.head.commit
    app = GithubAppInstallation(owner=commit.repository.owner)
    dbsession.add(app)
    > dbsession.flush()

    .../tests/unit/test_commit_notifications.py:90:
    _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
    .../local/lib/python3.12.../sqlalchemy/orm/session.py:2540: in flush
    self._flush(objects)
    .../local/lib/python3.12.../sqlalchemy/orm/session.py:2681: in _flush
    with util.safe_reraise():
    .../local/lib/python3.12.../sqlalchemy/util/langhelpers.py:68: in __exit__
    compat.raise_(
    .../local/lib/python3.12.../sqlalchemy/util/compat.py:182: in raise_
    raise exception
    .../local/lib/python3.12.../sqlalchemy/orm/session.py:2642: in _flush
    flush_context.execute()
    .../local/lib/python3.12.../sqlalchemy/orm/unitofwork.py:422: in execute
    rec.execute(self)
    .../local/lib/python3.12.../sqlalchemy/orm/unitofwork.py:586: in execute
    persistence.save_obj(
    .../local/lib/python3.12.../sqlalchemy/orm/persistence.py:239: in save_obj
    _emit_insert_statements(
    .../local/lib/python3.12.../sqlalchemy/orm/persistence.py:1135: in _emit_insert_statements
    result = cached_connections[connection].execute(
    .../local/lib/python3.12.../sqlalchemy/engine/base.py:1011: in execute
    return meth(self, multiparams, params)
    .../local/lib/python3.12.../sqlalchemy/sql/elements.py:298: in _execute_on_connection
    return connection._execute_clauseelement(self, multiparams, params)
    .../local/lib/python3.12.../sqlalchemy/engine/base.py:1124: in _execute_clauseelement
    ret = self._execute_context(
    .../local/lib/python3.12.../sqlalchemy/engine/base.py:1316: in _execute_context
    self._handle_dbapi_exception(
    .../local/lib/python3.12.../sqlalchemy/engine/base.py:1510: in _handle_dbapi_exception
    util.raise_(
    .../local/lib/python3.12.../sqlalchemy/util/compat.py:182: in raise_
    raise exception
    .../local/lib/python3.12.../sqlalchemy/engine/base.py:1276: in _execute_context
    self.dialect.do_execute(
    _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

    self = <sqlalchemy.dialects.postgresql.psycopg2.PGDialect_psycopg2 object at 0x7f5b8a64deb0>
    cursor = <cursor object at 0x7f5b83983880; closed: -1>
    statement = 'INSERT INTO codecov_auth_githubappinstallation (external_id, created_at, updated_at, name, owner_id, is_suspended) VA...reated_at)s, %(updated_at)s, %(name)s, %(owner_id)s, %(is_suspended)s) RETURNING codecov_auth_githubappinstallation.id'
    parameters = {'created_at': datetime.datetime(2024, 9, 20, 3, 10, 19, 999661, tzinfo=datetime.timezone.utc), 'external_id': UUID('8316f152-c288-426a-84ff-52d287ff1328'), 'is_suspended': False, 'name': 'codecov_app_installation', ...}
    context = <sqlalchemy.dialects.postgresql.psycopg2.PGExecutionContext_psycopg2 object at 0x7f5b6c5027b0>

    def do_execute(self, cursor, statement, parameters, context=None):
    > cursor.execute(statement, parameters)
    E sqlalchemy.exc.IntegrityError: (psycopg2.errors.NotNullViolation) null value in column "installation_id" of relation "codecov_auth_githubappinstallation" violates not-null constraint
    E DETAIL: Failing row contains (3, 8316f152-c288-426a-84ff-52d287ff1328, 2024-09-20 03:10:19.999661+00, 2024-09-20 03:10:19.999665+00, null, codecov_app_installation, null, 1392, null, null, f).
    E
    E [SQL: INSERT INTO codecov_auth_githubappinstallation (external_id, created_at, updated_at, name, owner_id, is_suspended) VALUES (%(external_id)s, %(created_at)s, %(updated_at)s, %(name)s, %(owner_id)s, %(is_suspended)s) RETURNING codecov_auth_githubappinstallation.id]
    E [parameters: {'external_id': UUID('8316f152-c288-426a-84ff-52d287ff1328'), 'created_at': datetime.datetime(2024, 9, 20, 3, 10, 19, 999661, tzinfo=datetime.timezone.utc), 'updated_at': datetime.datetime(2024, 9, 20, 3, 10, 19, 999665, tzinfo=datetime.timezone.utc), 'name': 'codecov_app_installation', 'owner_id': 1392, 'is_suspended': False}]
    E (Background on this error at: http://sqlalche..../e/13/gkpj)

    .../local/lib/python3.12.../sqlalchemy/engine/default.py:608: IntegrityError
  • Class name: services.tests.test_billing.TestBillingServiceTestCase
    Test name: test_plan_not_pr_author

    self = <worker.services.tests.test_billing.TestBillingServiceTestCase object at 0x7f5b8c1303b0>
    request = <FixtureRequest for <Function test_plan_not_pr_author>>
    dbsession = <sqlalchemy.orm.session.Session object at 0x7f5b628a2420>
    mocker = <pytest_mock.plugin.MockFixture object at 0x7f5b628a7ef0>
    mock_configuration = <shared.config.ConfigHelper object at 0x7f5b628a6a20>
    with_sql_functions = None

    def test_plan_not_pr_author(
    self, request, dbsession, mocker, mock_configuration, with_sql_functions
    ):
    owner = OwnerFactory.create(service="github")
    dbsession.add(owner)
    dbsession.flush()

    > assert not is_pr_billing_plan(owner.plan)
    E AssertionError: assert not True
    E + where True = is_pr_billing_plan('users-basic')
    E + where 'users-basic' = Owner<1526@service<github>>.plan

    services/tests/test_billing.py:41: AssertionError
  • Class name: services.tests.test_timeseries.TestTimeseriesService
    Test name: test_commit_measurement_update_component

    self = <worker.services.tests.test_timeseries.TestTimeseriesService object at 0x7f5b8bfbd040>
    dbsession = <sqlalchemy.orm.session.Session object at 0x7f5b88436720>
    sample_report_for_components = <Report files=3>, repository = Repo<658>
    mocker = <pytest_mock.plugin.MockFixture object at 0x7f5b885d03e0>

    def test_commit_measurement_update_component(
    self, dbsession, sample_report_for_components, repository, mocker
    ):
    mocker.patch("services.timeseries.timeseries_enabled", return_value=True)
    mocker.patch(
    "services.report.ReportService.get_existing_report_for_commit",
    return_value=ReadOnlyReport.create_from_report(
    sample_report_for_components
    ),
    )

    commit = CommitFactory.create(branch="foo", repository=repository)
    dbsession.add(commit)
    dbsession.flush()

    get_repo_yaml = mocker.patch("services.timeseries.get_repo_yaml")
    yaml_dict = {
    "component_management": {
    "individual_components": [
    {
    "component_id": "test-component-123",
    "name": "test component",
    "flag_regexes": ["random-flago-987"],
    "paths": [r"folder/*"],
    },
    ],
    }
    }
    get_repo_yaml.return_value = UserYaml(yaml_dict)

    measurement = MeasurementFactory.create(
    name=MeasurementName.component_coverage.value,
    owner_id=commit.repository.ownerid,
    repo_id=commit.repoid,
    measurable_id="test-component-123",
    commit_sha=commit.commitid,
    timestamp=commit.timestamp,
    branch="testing",
    value=0,
    )
    dbsession.add(measurement)
    dbsession.flush()

    save_commit_measurements(commit)

    measurements = (
    dbsession.query(Measurement)
    .filter_by(
    name=MeasurementName.component_coverage.value,
    commit_sha=commit.commitid,
    timestamp=commit.timestamp,
    measurable_id="test-component-123",
    )
    .all()
    )

    assert len(measurements) == 1
    measurement = measurements[0]
    assert measurement.name == MeasurementName.component_coverage.value
    assert measurement.owner_id == commit.repository.ownerid
    assert measurement.repo_id == commit.repoid
    assert measurement.commit_sha == commit.commitid
    assert measurement.timestamp.replace(
    tzinfo=timezone.utc
    ) == commit.timestamp.replace(tzinfo=timezone.utc)
    > assert measurement.branch == "foo"
    E AssertionError: assert 'testing' == 'foo'
    E
    E - foo
    E + testing

    services/tests/test_timeseries.py:640: AssertionError
  • Class name: tasks.tests.unit.test_backfill_commit_data_to_storage_task.TestBackfillCommitDataToStorageTask
    Test name: test_all_report_rows

    self = <worker.tasks.tests.unit.test_backfill_commit_data_to_storage_task.TestBackfillCommitDataToStorageTask object at 0x7f5b8a3305c0>
    mock_handle_single_row = <MagicMock name='handle_single_report_row' id='140031108705328'>
    dbsession = <sqlalchemy.orm.session.Session object at 0x7f5b83fe4950>

    @patch(
    "tasks.backfill_commit_data_to_storage.BackfillCommitDataToStorageTask.handle_single_report_row"
    )
    def test_all_report_rows(self, mock_handle_single_row, dbsession):
    def mock_handle_single_row_return_side_effect(db_session, commit, report_row):
    if report_row.code is None:
    return {"success": True, "errors": []}
    if report_row.code == "local":
    return {"success": False, "errors": [BackfillError.missing_data.value]}

    mock_handle_single_row.side_effect = mock_handle_single_row_return_side_effect
    commit = CommitFactory()
    dbsession.add(commit)
    report_default = CommitReport(commit=commit, code=None)
    report_code = CommitReport(commit=commit, code="local")
    dbsession.add(report_default)
    dbsession.add(report_code)
    task = BackfillCommitDataToStorageTask()
    result = task.handle_all_report_rows(dbsession, commit)
    assert result == {"success": False, "errors": ["missing_data"]}
    > mock_handle_single_row.assert_has_calls(
    [
    call(dbsession, commit, report_default),
    call(dbsession, commit, report_code),
    ]
    )
    E AssertionError: Calls not found.
    E Expected: [call(<sqlalchemy.orm.session.Session object at 0x7f5b83fe4950>, Commit<d946f58be64301b0dae8bdf9028b1231e9290110@repo<680>>, <database.models.reports.CommitReport object at 0x7f5b83fe58e0>),
    E call(<sqlalchemy.orm.session.Session object at 0x7f5b83fe4950>, Commit<d946f58be64301b0dae8bdf9028b1231e9290110@repo<680>>, <database.models.reports.CommitReport object at 0x7f5b883777a0>)]
    E Actual: [call(<sqlalchemy.orm.session.Session object at 0x7f5b83fe4950>, Commit<d946f58be64301b0dae8bdf9028b1231e9290110@repo<680>>, <database.models.reports.CommitReport object at 0x7f5b883777a0>),
    E call(<sqlalchemy.orm.session.Session object at 0x7f5b83fe4950>, Commit<d946f58be64301b0dae8bdf9028b1231e9290110@repo<680>>, <database.models.reports.CommitReport object at 0x7f5b83fe58e0>)]
    E
    E pytest introspection follows:
    E
    E Args:
    E assert (<sqlalchemy....7f5b83fe58e0>) == ([call(<sqlal...b883777a0>)],)
    E
    E At index 0 diff: <sqlalchemy.orm.session.Session object at 0x7f5b83fe4950> != [call(<sqlalchemy.orm.session.Session object at 0x7f5b83fe4950>, Commit<d946f58be64301b0dae8bdf9028b1231e9290110@repo<680>>, <database.models.reports.CommitReport object at 0x7f5b83fe58e0>), call(<sqlalchemy.orm.session.Session object at 0x7f5b83fe4950>, Commit<d946f58be64301b0dae8bdf9028b1231e9290110@repo<680>>, <database.models.reports.CommitReport object at 0x7f5b883777a0>)]
    E Left contains 2 more items, first extra item: Commit<d946f58be64301b0dae8bdf9028b1231e9290110@repo<680>>
    E Use -v to get more diff

    .../tests/unit/test_backfill_commit_data_to_storage_task.py:84: AssertionError
  • Class name: tasks.tests.unit.test_bundle_analysis_processor_task
    Test name: test_bundle_analysis_process_associate_called

    self = <sqlalchemy.engine.base.Connection object at 0x7f5b6c534320>
    dialect = <sqlalchemy.dialects.postgresql.psycopg2.PGDialect_psycopg2 object at 0x7f5b8a64deb0>
    constructor = <bound method DefaultExecutionContext._init_compiled of <class 'sqlalchemy.dialects.postgresql.psycopg2.PGExecutionContext_psycopg2'>>
    statement = 'INSERT INTO commits (id, author, branch, ci_passed, commitid, deleted, message, notified, merged, parent, pullid, rep...s, %(pullid)s, %(repoid)s, %(state)s, %(timestamp)s, %(updatestamp)s, %(totals)s, %(report)s, %(report_storage_path)s)'
    parameters = {'author': 2109, 'branch': None, 'ci_passed': True, 'commitid': 'c21cf35e297d59f86e5c11a59b32be67b3f0175d', ...}
    args = (<sqlalchemy.dialects.postgresql.psycopg2.PGCompiler_psycopg2 object at 0x7f5b83f83560>, [{'author': 2109, 'branch': None, 'ci_passed': True, 'commitid': 'c21cf35e297d59f86e5c11a59b32be67b3f0175d', ...}])
    conn = <sqlalchemy.pool.base._ConnectionFairy object at 0x7f5b62713aa0>
    context = <sqlalchemy.dialects.postgresql.psycopg2.PGExecutionContext_psycopg2 object at 0x7f5b628a3bf0>

    def _execute_context(
    self, dialect, constructor, statement, parameters, *args
    ):
    """Create an :class:`.ExecutionContext` and execute, returning
    a :class:`_engine.ResultProxy`.

    """

    try:
    try:
    conn = self.__connection
    except AttributeError:
    # escape "except AttributeError" before revalidating
    # to prevent misleading stacktraces in Py3K
    conn = None
    if conn is None:
    conn = self._revalidate_connection()

    context = constructor(dialect, self, conn, *args)
    except BaseException as e:
    self._handle_dbapi_exception(
    e, util.text_type(statement), parameters, None, None
    )

    if context.compiled:
    context.pre_exec()

    cursor, statement, parameters = (
    context.cursor,
    context.statement,
    context.parameters,
    )

    if not context.executemany:
    parameters = parameters[0]

    if self._has_events or self.engine._has_events:
    for fn in self.dispatch.before_cursor_execute:
    statement, parameters = fn(
    self,
    cursor,
    statement,
    parameters,
    context,
    context.executemany,
    )

    if self._echo:
    self.engine.logger.info(statement)
    if not self.engine.hide_parameters:
    self.engine.logger.info(
    "%r",
    sql_util._repr_params(
    parameters, batches=10, ismulti=context.executemany
    ),
    )
    else:
    self.engine.logger.info(
    "[SQL parameters hidden due to hide_parameters=True]"
    )

    evt_handled = False
    try:
    if context.executemany:
    if self.dialect._has_events:
    for fn in self.dialect.dispatch.do_executemany:
    if fn(cursor, statement, parameters, context):
    evt_handled = True
    break
    if not evt_handled:
    self.dialect.do_executemany(
    cursor, statement, parameters, context
    )
    elif not parameters and context.no_parameters:
    if self.dialect._has_events:
    for fn in self.dialect.dispatch.do_execute_no_params:
    if fn(cursor, statement, context):
    evt_handled = True
    break
    if not evt_handled:
    self.dialect.do_execute_no_params(
    cursor, statement, context
    )
    else:
    if self.dialect._has_events:
    for fn in self.dialect.dispatch.do_execute:
    if fn(cursor, statement, parameters, context):
    evt_handled = True
    break
    if not evt_handled:
    > self.dialect.do_execute(
    cursor, statement, parameters, context
    )

    .../local/lib/python3.12.../sqlalchemy/engine/base.py:1276:
    _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

    self = <sqlalchemy.dialects.postgresql.psycopg2.PGDialect_psycopg2 object at 0x7f5b8a64deb0>
    cursor = <cursor object at 0x7f5b83904e50; closed: -1>
    statement = 'INSERT INTO commits (id, author, branch, ci_passed, commitid, deleted, message, notified, merged, parent, pullid, rep...s, %(pullid)s, %(repoid)s, %(state)s, %(timestamp)s, %(updatestamp)s, %(totals)s, %(report)s, %(report_storage_path)s)'
    parameters = {'author': 2109, 'branch': None, 'ci_passed': True, 'commitid': 'c21cf35e297d59f86e5c11a59b32be67b3f0175d', ...}
    context = <sqlalchemy.dialects.postgresql.psycopg2.PGExecutionContext_psycopg2 object at 0x7f5b628a3bf0>

    def do_execute(self, cursor, statement, parameters, context=None):
    > cursor.execute(statement, parameters)
    E psycopg2.errors.InvalidTextRepresentation: invalid input value for enum commit_state: "completed"
    E LINE 1: ...f35e297d59f86e5c11a59b32be67b3f0175d', NULL, 716, 'completed...
    E ^

    .../local/lib/python3.12.../sqlalchemy/engine/default.py:608: InvalidTextRepresentation

    The above exception was the direct cause of the following exception:

    mocker = <pytest_mock.plugin.MockFixture object at 0x7f5b62710d10>
    dbsession = <sqlalchemy.orm.session.Session object at 0x7f5b62713c20>
    mock_storage = <shared.storage.memory.MemoryStorageService object at 0x7f5b881738f0>

    def test_bundle_analysis_process_associate_called(
    mocker,
    dbsession,
    mock_storage,
    ):
    storage_path = (
    ".../testing/ed1bdd67-8fd2-4cdb-ac9e-39b99e4a3892/bundle_report.sqlite"
    )
    mock_storage.write_file(get_bucket_name(), storage_path, "test-content")

    mocker.patch.object(
    BundleAnalysisProcessorTask,
    "app",
    tasks={
    bundle_analysis_save_measurements_task_name: mocker.MagicMock(),
    },
    )

    parent_commit = CommitFactory.create(state="completed")
    dbsession.add(parent_commit)
    > dbsession.flush()

    .../tests/unit/test_bundle_analysis_processor_task.py:620:
    _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
    .../local/lib/python3.12.../sqlalchemy/orm/session.py:2540: in flush
    self._flush(objects)
    .../local/lib/python3.12.../sqlalchemy/orm/session.py:2681: in _flush
    with util.safe_reraise():
    .../local/lib/python3.12.../sqlalchemy/util/langhelpers.py:68: in __exit__
    compat.raise_(
    .../local/lib/python3.12.../sqlalchemy/util/compat.py:182: in raise_
    raise exception
    .../local/lib/python3.12.../sqlalchemy/orm/session.py:2642: in _flush
    flush_context.execute()
    .../local/lib/python3.12.../sqlalchemy/orm/unitofwork.py:422: in execute
    rec.execute(self)
    .../local/lib/python3.12.../sqlalchemy/orm/unitofwork.py:586: in execute
    persistence.save_obj(
    .../local/lib/python3.12.../sqlalchemy/orm/persistence.py:239: in save_obj
    _emit_insert_statements(
    .../local/lib/python3.12.../sqlalchemy/orm/persistence.py:1083: in _emit_insert_statements
    c = cached_connections[connection].execute(statement, multiparams)
    .../local/lib/python3.12.../sqlalchemy/engine/base.py:1011: in execute
    return meth(self, multiparams, params)
    .../local/lib/python3.12.../sqlalchemy/sql/elements.py:298: in _execute_on_connection
    return connection._execute_clauseelement(self, multiparams, params)
    .../local/lib/python3.12.../sqlalchemy/engine/base.py:1124: in _execute_clauseelement
    ret = self._execute_context(
    .../local/lib/python3.12.../sqlalchemy/engine/base.py:1316: in _execute_context
    self._handle_dbapi_exception(
    .../local/lib/python3.12.../sqlalchemy/engine/base.py:1510: in _handle_dbapi_exception
    util.raise_(
    .../local/lib/python3.12.../sqlalchemy/util/compat.py:182: in raise_
    raise exception
    .../local/lib/python3.12.../sqlalchemy/engine/base.py:1276: in _execute_context
    self.dialect.do_execute(
    _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

    self = <sqlalchemy.dialects.postgresql.psycopg2.PGDialect_psycopg2 object at 0x7f5b8a64deb0>
    cursor = <cursor object at 0x7f5b83904e50; closed: -1>
    statement = 'INSERT INTO commits (id, author, branch, ci_passed, commitid, deleted, message, notified, merged, parent, pullid, rep...s, %(pullid)s, %(repoid)s, %(state)s, %(timestamp)s, %(updatestamp)s, %(totals)s, %(report)s, %(report_storage_path)s)'
    parameters = {'author': 2109, 'branch': None, 'ci_passed': True, 'commitid': 'c21cf35e297d59f86e5c11a59b32be67b3f0175d', ...}
    context = <sqlalchemy.dialects.postgresql.psycopg2.PGExecutionContext_psycopg2 object at 0x7f5b628a3bf0>

    def do_execute(self, cursor, statement, parameters, context=None):
    > cursor.execute(statement, parameters)
    E sqlalchemy.exc.DataError: (psycopg2.errors.InvalidTextRepresentation) invalid input value for enum commit_state: "completed"
    E LINE 1: ...f35e297d59f86e5c11a59b32be67b3f0175d', NULL, 716, 'completed...
    E ^
    E
    E [SQL: INSERT INTO commits (id, author, branch, ci_passed, commitid, deleted, message, notified, merged, parent, pullid, repoid, state, timestamp, updatestamp, totals, report, report_storage_path) VALUES (%(id)s, %(author)s, %(branch)s, %(ci_passed)s, %(commitid)s, %(deleted)s, %(message)s, %(notified)s, %(merged)s, %(parent)s, %(pullid)s, %(repoid)s, %(state)s, %(timestamp)s, %(updatestamp)s, %(totals)s, %(report)s, %(report_storage_path)s)]
    E [parameters: {'id': 1068, 'author': 2109, 'branch': None, 'ci_passed': True, 'commitid': 'c21cf35e297d59f86e5c11a59b32be67b3f0175d', 'deleted': None, 'message': 'Any step power.', 'notified': None, 'merged': None, 'parent': 'c21cf35e297d59f86e5c11a59b32be67b3f0175d', 'pullid': None, 'repoid': 716, 'state': 'completed', 'timestamp': datetime.datetime(2019, 2, 1, 17, 59, 47, tzinfo=datetime.timezone.utc), 'updatestamp': None, 'totals': '{"C": 0, "M": 0, "N": 0, "b": 0, "c": "85.00000", "d": 0, "diff": [1, 2, 1, 1, 0, "50.00000", 0, 0, 0, 0, 0, 0, 0], "f": 3, "h": 17, "m": 3, "n": 20, "p": 0, "s": 1}', 'report': '{"files": {"awesome/__init__.py": [2, [0, 10, 8, 2, 0, "80.00000", 0, 0, 0, 0, 0, 0, 0], [[0, 10, 8, 2, 0, "80.00000", 0, 0, 0, 0, 0, 0, 0]], [0, 2, ... (483 characters truncated) ... ull, "d": 1547084427, "e": null, "f": ["unit"], "j": null, "n": null, "p": null, "t": [3, 20, 17, 3, 0, "85.00000", 0, 0, 0, 0, 0, 0, 0], "": null}}}', 'report_storage_path': None}]
    E (Background on this error at: http://sqlalche..../e/13/9h9h)

    .../local/lib/python3.12.../sqlalchemy/engine/default.py:608: DataError
  • Class name: tasks.tests.unit.test_bundle_analysis_processor_task
    Test name: test_bundle_analysis_process_associate_called_two

    self = <sqlalchemy.engine.base.Connection object at 0x7f5b83d6ee40>
    dialect = <sqlalchemy.dialects.postgresql.psycopg2.PGDialect_psycopg2 object at 0x7f5b8a64deb0>
    constructor = <bound method DefaultExecutionContext._init_compiled of <class 'sqlalchemy.dialects.postgresql.psycopg2.PGExecutionContext_psycopg2'>>
    statement = 'INSERT INTO commits (id, author, branch, ci_passed, commitid, deleted, message, notified, merged, parent, pullid, rep...s, %(pullid)s, %(repoid)s, %(state)s, %(timestamp)s, %(updatestamp)s, %(totals)s, %(report)s, %(report_storage_path)s)'
    parameters = {'author': 2111, 'branch': None, 'ci_passed': True, 'commitid': '7bf9cfd073e0b1563a0b849bf9c982cf4fd91d7d', ...}
    args = (<sqlalchemy.dialects.postgresql.psycopg2.PGCompiler_psycopg2 object at 0x7f5b83f83560>, [{'author': 2111, 'branch': None, 'ci_passed': True, 'commitid': '7bf9cfd073e0b1563a0b849bf9c982cf4fd91d7d', ...}])
    conn = <sqlalchemy.pool.base._ConnectionFairy object at 0x7f5b628da3c0>
    context = <sqlalchemy.dialects.postgresql.psycopg2.PGExecutionContext_psycopg2 object at 0x7f5b627132c0>

    def _execute_context(
    self, dialect, constructor, statement, parameters, *args
    ):
    """Create an :class:`.ExecutionContext` and execute, returning
    a :class:`_engine.ResultProxy`.

    """

    try:
    try:
    conn = self.__connection
    except AttributeError:
    # escape "except AttributeError" before revalidating
    # to prevent misleading stacktraces in Py3K
    conn = None
    if conn is None:
    conn = self._revalidate_connection()

    context = constructor(dialect, self, conn, *args)
    except BaseException as e:
    self._handle_dbapi_exception(
    e, util.text_type(statement), parameters, None, None
    )

    if context.compiled:
    context.pre_exec()

    cursor, statement, parameters = (
    context.cursor,
    context.statement,
    context.parameters,
    )

    if not context.executemany:
    parameters = parameters[0]

    if self._has_events or self.engine._has_events:
    for fn in self.dispatch.before_cursor_execute:
    statement, parameters = fn(
    self,
    cursor,
    statement,
    parameters,
    context,
    context.executemany,
    )

    if self._echo:
    self.engine.logger.info(statement)
    if not self.engine.hide_parameters:
    self.engine.logger.info(
    "%r",
    sql_util._repr_params(
    parameters, batches=10, ismulti=context.executemany
    ),
    )
    else:
    self.engine.logger.info(
    "[SQL parameters hidden due to hide_parameters=True]"
    )

    evt_handled = False
    try:
    if context.executemany:
    if self.dialect._has_events:
    for fn in self.dialect.dispatch.do_executemany:
    if fn(cursor, statement, parameters, context):
    evt_handled = True
    break
    if not evt_handled:
    self.dialect.do_executemany(
    cursor, statement, parameters, context
    )
    elif not parameters and context.no_parameters:
    if self.dialect._has_events:
    for fn in self.dialect.dispatch.do_execute_no_params:
    if fn(cursor, statement, context):
    evt_handled = True
    break
    if not evt_handled:
    self.dialect.do_execute_no_params(
    cursor, statement, context
    )
    else:
    if self.dialect._has_events:
    for fn in self.dialect.dispatch.do_execute:
    if fn(cursor, statement, parameters, context):
    evt_handled = True
    break
    if not evt_handled:
    > self.dialect.do_execute(
    cursor, statement, parameters, context
    )

    .../local/lib/python3.12.../sqlalchemy/engine/base.py:1276:
    _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

    self = <sqlalchemy.dialects.postgresql.psycopg2.PGDialect_psycopg2 object at 0x7f5b8a64deb0>
    cursor = <cursor object at 0x7f5b839d48b0; closed: -1>
    statement = 'INSERT INTO commits (id, author, branch, ci_passed, commitid, deleted, message, notified, merged, parent, pullid, rep...s, %(pullid)s, %(repoid)s, %(state)s, %(timestamp)s, %(updatestamp)s, %(totals)s, %(report)s, %(report_storage_path)s)'
    parameters = {'author': 2111, 'branch': None, 'ci_passed': True, 'commitid': '7bf9cfd073e0b1563a0b849bf9c982cf4fd91d7d', ...}
    context = <sqlalchemy.dialects.postgresql.psycopg2.PGExecutionContext_psycopg2 object at 0x7f5b627132c0>

    def do_execute(self, cursor, statement, parameters, context=None):
    > cursor.execute(statement, parameters)
    E psycopg2.errors.InvalidTextRepresentation: invalid input value for enum commit_state: "completed"
    E LINE 1: ...cfd073e0b1563a0b849bf9c982cf4fd91d7d', NULL, 717, 'completed...
    E ^

    .../local/lib/python3.12.../sqlalchemy/engine/default.py:608: InvalidTextRepresentation

    The above exception was the direct cause of the following exception:

    mocker = <pytest_mock.plugin.MockFixture object at 0x7f5b628da660>
    dbsession = <sqlalchemy.orm.session.Session object at 0x7f5b628da120>
    mock_storage = <shared.storage.memory.MemoryStorageService object at 0x7f5b83d017c0>

    @pytest.mark.django_db(databases={"default", "timeseries"})
    def test_bundle_analysis_process_associate_called_two(
    mocker,
    dbsession,
    mock_storage,
    ):
    storage_path = (
    ".../testing/ed1bdd67-8fd2-4cdb-ac9e-39b99e4a3892/bundle_report.sqlite"
    )
    mock_storage.write_file(get_bucket_name(), storage_path, "test-content")

    mocker.patch.object(
    BundleAnalysisProcessorTask,
    "app",
    tasks={
    bundle_analysis_save_measurements_task_name: mocker.MagicMock(),
    },
    )

    parent_commit = CommitFactory.create(state="completed")
    dbsession.add(parent_commit)
    > dbsession.flush()

    .../tests/unit/test_bundle_analysis_processor_task.py:684:
    _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
    .../local/lib/python3.12.../sqlalchemy/orm/session.py:2540: in flush
    self._flush(objects)
    .../local/lib/python3.12.../sqlalchemy/orm/session.py:2681: in _flush
    with util.safe_reraise():
    .../local/lib/python3.12.../sqlalchemy/util/langhelpers.py:68: in __exit__
    compat.raise_(
    .../local/lib/python3.12.../sqlalchemy/util/compat.py:182: in raise_
    raise exception
    .../local/lib/python3.12.../sqlalchemy/orm/session.py:2642: in _flush
    flush_context.execute()
    .../local/lib/python3.12.../sqlalchemy/orm/unitofwork.py:422: in execute
    rec.execute(self)
    .../local/lib/python3.12.../sqlalchemy/orm/unitofwork.py:586: in execute
    persistence.save_obj(
    .../local/lib/python3.12.../sqlalchemy/orm/persistence.py:239: in save_obj
    _emit_insert_statements(
    .../local/lib/python3.12.../sqlalchemy/orm/persistence.py:1083: in _emit_insert_statements
    c = cached_connections[connection].execute(statement, multiparams)
    .../local/lib/python3.12.../sqlalchemy/engine/base.py:1011: in execute
    return meth(self, multiparams, params)
    .../local/lib/python3.12.../sqlalchemy/sql/elements.py:298: in _execute_on_connection
    return connection._execute_clauseelement(self, multiparams, params)
    .../local/lib/python3.12.../sqlalchemy/engine/base.py:1124: in _execute_clauseelement
    ret = self._execute_context(
    .../local/lib/python3.12.../sqlalchemy/engine/base.py:1316: in _execute_context
    self._handle_dbapi_exception(
    .../local/lib/python3.12.../sqlalchemy/engine/base.py:1510: in _handle_dbapi_exception
    util.raise_(
    .../local/lib/python3.12.../sqlalchemy/util/compat.py:182: in raise_
    raise exception
    .../local/lib/python3.12.../sqlalchemy/engine/base.py:1276: in _execute_context
    self.dialect.do_execute(
    _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

    self = <sqlalchemy.dialects.postgresql.psycopg2.PGDialect_psycopg2 object at 0x7f5b8a64deb0>
    cursor = <cursor object at 0x7f5b839d48b0; closed: -1>
    statement = 'INSERT INTO commits (id, author, branch, ci_passed, commitid, deleted, message, notified, merged, parent, pullid, rep...s, %(pullid)s, %(repoid)s, %(state)s, %(timestamp)s, %(updatestamp)s, %(totals)s, %(report)s, %(report_storage_path)s)'
    parameters = {'author': 2111, 'branch': None, 'ci_passed': True, 'commitid': '7bf9cfd073e0b1563a0b849bf9c982cf4fd91d7d', ...}
    context = <sqlalchemy.dialects.postgresql.psycopg2.PGExecutionContext_psycopg2 object at 0x7f5b627132c0>

    def do_execute(self, cursor, statement, parameters, context=None):
    > cursor.execute(statement, parameters)
    E sqlalchemy.exc.DataError: (psycopg2.errors.InvalidTextRepresentation) invalid input value for enum commit_state: "completed"
    E LINE 1: ...cfd073e0b1563a0b849bf9c982cf4fd91d7d', NULL, 717, 'completed...
    E ^
    E
    E [SQL: INSERT INTO commits (id, author, branch, ci_passed, commitid, deleted, message, notified, merged, parent, pullid, repoid, state, timestamp, updatestamp, totals, report, report_storage_path) VALUES (%(id)s, %(author)s, %(branch)s, %(ci_passed)s, %(commitid)s, %(deleted)s, %(message)s, %(notified)s, %(merged)s, %(parent)s, %(pullid)s, %(repoid)s, %(state)s, %(timestamp)s, %(updatestamp)s, %(totals)s, %(report)s, %(report_storage_path)s)]
    E [parameters: {'id': 1069, 'author': 2111, 'branch': None, 'ci_passed': True, 'commitid': '7bf9cfd073e0b1563a0b849bf9c982cf4fd91d7d', 'deleted': None, 'message': 'Interest manage state order staff.', 'notified': None, 'merged': None, 'parent': '7bf9cfd073e0b1563a0b849bf9c982cf4fd91d7d', 'pullid': None, 'repoid': 717, 'state': 'completed', 'timestamp': datetime.datetime(2019, 2, 1, 17, 59, 47, tzinfo=datetime.timezone.utc), 'updatestamp': None, 'totals': '{"C": 0, "M": 0, "N": 0, "b": 0, "c": "85.00000", "d": 0, "diff": [1, 2, 1, 1, 0, "50.00000", 0, 0, 0, 0, 0, 0, 0], "f": 3, "h": 17, "m": 3, "n": 20, "p": 0, "s": 1}', 'report': '{"files": {"awesome/__init__.py": [2, [0, 10, 8, 2, 0, "80.00000", 0, 0, 0, 0, 0, 0, 0], [[0, 10, 8, 2, 0, "80.00000", 0, 0, 0, 0, 0, 0, 0]], [0, 2, ... (483 characters truncated) ... ull, "d": 1547084427, "e": null, "f": ["unit"], "j": null, "n": null, "p": null, "t": [3, 20, 17, 3, 0, "85.00000", 0, 0, 0, 0, 0, 0, 0], "": null}}}', 'report_storage_path': None}]
    E (Background on this error at: http://sqlalche..../e/13/9h9h)

    .../local/lib/python3.12.../sqlalchemy/engine/default.py:608: DataError
  • Class name: tasks.tests.unit.test_bundle_analysis_processor_task
    Test name: test_bundle_analysis_process_associate_no_parent_commit_id

    self = <sqlalchemy.engine.base.Connection object at 0x7f5b60f44620>
    dialect = <sqlalchemy.dialects.postgresql.psycopg2.PGDialect_psycopg2 object at 0x7f5b8a64deb0>
    constructor = <bound method DefaultExecutionContext._init_compiled of <class 'sqlalchemy.dialects.postgresql.psycopg2.PGExecutionContext_psycopg2'>>
    statement = 'INSERT INTO commits (id, author, branch, ci_passed, commitid, deleted, message, notified, merged, parent, pullid, rep...s, %(pullid)s, %(repoid)s, %(state)s, %(timestamp)s, %(updatestamp)s, %(totals)s, %(report)s, %(report_storage_path)s)'
    parameters = {'author': 2103, 'branch': None, 'ci_passed': True, 'commitid': '806265b56cb2eeee2945318535828a82a28f06bc', ...}
    args = (<sqlalchemy.dialects.postgresql.psycopg2.PGCompiler_psycopg2 object at 0x7f5b83f83560>, [{'author': 2103, 'branch': None, 'ci_passed': True, 'commitid': '806265b56cb2eeee2945318535828a82a28f06bc', ...}])
    conn = <sqlalchemy.pool.base._ConnectionFairy object at 0x7f5b62712db0>
    context = <sqlalchemy.dialects.postgresql.psycopg2.PGExecutionContext_psycopg2 object at 0x7f5b83b02150>

    def _execute_context(
    self, dialect, constructor, statement, parameters, *args
    ):
    """Create an :class:`.ExecutionContext` and execute, returning
    a :class:`_engine.ResultProxy`.

    """

    try:
    try:
    conn = self.__connection
    except AttributeError:
    # escape "except AttributeError" before revalidating
    # to prevent misleading stacktraces in Py3K
    conn = None
    if conn is None:
    conn = self._revalidate_connection()

    context = constructor(dialect, self, conn, *args)
    except BaseException as e:
    self._handle_dbapi_exception(
    e, util.text_type(statement), parameters, None, None
    )

    if context.compiled:
    context.pre_exec()

    cursor, statement, parameters = (
    context.cursor,
    context.statement,
    context.parameters,
    )

    if not context.executemany:
    parameters = parameters[0]

    if self._has_events or self.engine._has_events:
    for fn in self.dispatch.before_cursor_execute:
    statement, parameters = fn(
    self,
    cursor,
    statement,
    parameters,
    context,
    context.executemany,
    )

    if self._echo:
    self.engine.logger.info(statement)
    if not self.engine.hide_parameters:
    self.engine.logger.info(
    "%r",
    sql_util._repr_params(
    parameters, batches=10, ismulti=context.executemany
    ),
    )
    else:
    self.engine.logger.info(
    "[SQL parameters hidden due to hide_parameters=True]"
    )

    evt_handled = False
    try:
    if context.executemany:
    if self.dialect._has_events:
    for fn in self.dialect.dispatch.do_executemany:
    if fn(cursor, statement, parameters, context):
    evt_handled = True
    break
    if not evt_handled:
    self.dialect.do_executemany(
    cursor, statement, parameters, context
    )
    elif not parameters and context.no_parameters:
    if self.dialect._has_events:
    for fn in self.dialect.dispatch.do_execute_no_params:
    if fn(cursor, statement, context):
    evt_handled = True
    break
    if not evt_handled:
    self.dialect.do_execute_no_params(
    cursor, statement, context
    )
    else:
    if self.dialect._has_events:
    for fn in self.dialect.dispatch.do_execute:
    if fn(cursor, statement, parameters, context):
    evt_handled = True
    break
    if not evt_handled:
    > self.dialect.do_execute(
    cursor, statement, parameters, context
    )

    .../local/lib/python3.12.../sqlalchemy/engine/base.py:1276:
    _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

    self = <sqlalchemy.dialects.postgresql.psycopg2.PGDialect_psycopg2 object at 0x7f5b8a64deb0>
    cursor = <cursor object at 0x7f5b887d4f40; closed: -1>
    statement = 'INSERT INTO commits (id, author, branch, ci_passed, commitid, deleted, message, notified, merged, parent, pullid, rep...s, %(pullid)s, %(repoid)s, %(state)s, %(timestamp)s, %(updatestamp)s, %(totals)s, %(report)s, %(report_storage_path)s)'
    parameters = {'author': 2103, 'branch': None, 'ci_passed': True, 'commitid': '806265b56cb2eeee2945318535828a82a28f06bc', ...}
    context = <sqlalchemy.dialects.postgresql.psycopg2.PGExecutionContext_psycopg2 object at 0x7f5b83b02150>

    def do_execute(self, cursor, statement, parameters, context=None):
    > cursor.execute(statement, parameters)
    E psycopg2.errors.InvalidTextRepresentation: invalid input value for enum commit_state: "completed"
    E LINE 1: ...65b56cb2eeee2945318535828a82a28f06bc', NULL, 713, 'completed...
    E ^

    .../local/lib/python3.12.../sqlalchemy/engine/default.py:608: InvalidTextRepresentation

    The above exception was the direct cause of the following exception:

    mocker = <pytest_mock.plugin.MockFixture object at 0x7f5b60f46960>
    dbsession = <sqlalchemy.orm.session.Session object at 0x7f5b83d36bd0>
    mock_storage = <shared.storage.memory.MemoryStorageService object at 0x7f5b83d35520>

    def test_bundle_analysis_process_associate_no_parent_commit_id(
    mocker,
    dbsession,
    mock_storage,
    ):
    storage_path = (
    ".../testing/ed1bdd67-8fd2-4cdb-ac9e-39b99e4a3892/bundle_report.sqlite"
    )
    mock_storage.write_file(get_bucket_name(), storage_path, "test-content")

    mocker.patch.object(
    BundleAnalysisProcessorTask,
    "app",
    tasks={
    bundle_analysis_save_measurements_task_name: mocker.MagicMock(),
    },
    )

    parent_commit = CommitFactory.create(state="completed")
    dbsession.add(parent_commit)
    > dbsession.flush()

    .../tests/unit/test_bundle_analysis_processor_task.py:457:
    _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
    .../local/lib/python3.12.../sqlalchemy/orm/session.py:2540: in flush
    self._flush(objects)
    .../local/lib/python3.12.../sqlalchemy/orm/session.py:2681: in _flush
    with util.safe_reraise():
    .../local/lib/python3.12.../sqlalchemy/util/langhelpers.py:68: in __exit__
    compat.raise_(
    .../local/lib/python3.12.../sqlalchemy/util/compat.py:182: in raise_
    raise exception
    .../local/lib/python3.12.../sqlalchemy/orm/session.py:2642: in _flush
    flush_context.execute()
    .../local/lib/python3.12.../sqlalchemy/orm/unitofwork.py:422: in execute
    rec.execute(self)
    .../local/lib/python3.12.../sqlalchemy/orm/unitofwork.py:586: in execute
    persistence.save_obj(
    .../local/lib/python3.12.../sqlalchemy/orm/persistence.py:239: in save_obj
    _emit_insert_statements(
    .../local/lib/python3.12.../sqlalchemy/orm/persistence.py:1083: in _emit_insert_statements
    c = cached_connections[connection].execute(statement, multiparams)
    .../local/lib/python3.12.../sqlalchemy/engine/base.py:1011: in execute
    return meth(self, multiparams, params)
    .../local/lib/python3.12.../sqlalchemy/sql/elements.py:298: in _execute_on_connection
    return connection._execute_clauseelement(self, multiparams, params)
    .../local/lib/python3.12.../sqlalchemy/engine/base.py:1124: in _execute_clauseelement
    ret = self._execute_context(
    .../local/lib/python3.12.../sqlalchemy/engine/base.py:1316: in _execute_context
    self._handle_dbapi_exception(
    .../local/lib/python3.12.../sqlalchemy/engine/base.py:1510: in _handle_dbapi_exception
    util.raise_(
    .../local/lib/python3.12.../sqlalchemy/util/compat.py:182: in raise_
    raise exception
    .../local/lib/python3.12.../sqlalchemy/engine/base.py:1276: in _execute_context
    self.dialect.do_execute(
    _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

    self = <sqlalchemy.dialects.postgresql.psycopg2.PGDialect_psycopg2 object at 0x7f5b8a64deb0>
    cursor = <cursor object at 0x7f5b887d4f40; closed: -1>
    statement = 'INSERT INTO commits (id, author, branch, ci_passed, commitid, deleted, message, notified, merged, parent, pullid, rep...s, %(pullid)s, %(repoid)s, %(state)s, %(timestamp)s, %(updatestamp)s, %(totals)s, %(report)s, %(report_storage_path)s)'
    parameters = {'author': 2103, 'branch': None, 'ci_passed': True, 'commitid': '806265b56cb2eeee2945318535828a82a28f06bc', ...}
    context = <sqlalchemy.dialects.postgresql.psycopg2.PGExecutionContext_psycopg2 object at 0x7f5b83b02150>

    def do_execute(self, cursor, statement, parameters, context=None):
    > cursor.execute(statement, parameters)
    E sqlalchemy.exc.DataError: (psycopg2.errors.InvalidTextRepresentation) invalid input value for enum commit_state: "completed"
    E LINE 1: ...65b56cb2eeee2945318535828a82a28f06bc', NULL, 713, 'completed...
    E ^
    E
    E [SQL: INSERT INTO commits (id, author, branch, ci_passed, commitid, deleted, message, notified, merged, parent, pullid, repoid, state, timestamp, updatestamp, totals, report, report_storage_path) VALUES (%(id)s, %(author)s, %(branch)s, %(ci_passed)s, %(commitid)s, %(deleted)s, %(message)s, %(notified)s, %(merged)s, %(parent)s, %(pullid)s, %(repoid)s, %(state)s, %(timestamp)s, %(updatestamp)s, %(totals)s, %(report)s, %(report_storage_path)s)]
    E [parameters: {'id': 1064, 'author': 2103, 'branch': None, 'ci_passed': True, 'commitid': '806265b56cb2eeee2945318535828a82a28f06bc', 'deleted': None, 'message': 'Wear all already sister.', 'notified': None, 'merged': None, 'parent': '806265b56cb2eeee2945318535828a82a28f06bc', 'pullid': None, 'repoid': 713, 'state': 'completed', 'timestamp': datetime.datetime(2019, 2, 1, 17, 59, 47, tzinfo=datetime.timezone.utc), 'updatestamp': None, 'totals': '{"C": 0, "M": 0, "N": 0, "b": 0, "c": "85.00000", "d": 0, "diff": [1, 2, 1, 1, 0, "50.00000", 0, 0, 0, 0, 0, 0, 0], "f": 3, "h": 17, "m": 3, "n": 20, "p": 0, "s": 1}', 'report': '{"files": {"awesome/__init__.py": [2, [0, 10, 8, 2, 0, "80.00000", 0, 0, 0, 0, 0, 0, 0], [[0, 10, 8, 2, 0, "80.00000", 0, 0, 0, 0, 0, 0, 0]], [0, 2, ... (483 characters truncated) ... ull, "d": 1547084427, "e": null, "f": ["unit"], "j": null, "n": null, "p": null, "t": [3, 20, 17, 3, 0, "85.00000", 0, 0, 0, 0, 0, 0, 0], "": null}}}', 'report_storage_path': None}]
    E (Background on this error at: http://sqlalche..../e/13/9h9h)

    .../local/lib/python3.12.../sqlalchemy/engine/default.py:608: DataError
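
    Note: this failure, and the two identical bundle-analysis failures below, come from `CommitFactory.create(state="completed")` hitting the `commit_state` enum that the Django-run migrations now enforce; "completed" is not accepted. A minimal sketch of the kind of adjustment the enforced constraint calls for, reusing the test's own `CommitFactory` and `dbsession` fixture and assuming (from the DataError only, not from the migration itself) that "complete" is a valid member:

        # Hedged sketch: "complete" is an assumed member of the commit_state
        # enum, inferred from the rejection of "completed" above.
        parent_commit = CommitFactory.create(state="complete")
        dbsession.add(parent_commit)
        dbsession.flush()
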
  • Class name: tasks.tests.unit.test_bundle_analysis_processor_task
    Test name: test_bundle_analysis_process_associate_no_parent_commit_report_object

    self = <sqlalchemy.engine.base.Connection object at 0x7f5b60f46300>
    dialect = <sqlalchemy.dialects.postgresql.psycopg2.PGDialect_psycopg2 object at 0x7f5b8a64deb0>
    constructor = <bound method DefaultExecutionContext._init_compiled of <class 'sqlalchemy.dialects.postgresql.psycopg2.PGExecutionContext_psycopg2'>>
    statement = 'INSERT INTO commits (id, author, branch, ci_passed, commitid, deleted, message, notified, merged, parent, pullid, rep...s, %(pullid)s, %(repoid)s, %(state)s, %(timestamp)s, %(updatestamp)s, %(totals)s, %(report)s, %(report_storage_path)s)'
    parameters = {'author': 2107, 'branch': None, 'ci_passed': True, 'commitid': '40a192c28bd861177c2023873f70f7dc6421359c', ...}
    args = (<sqlalchemy.dialects.postgresql.psycopg2.PGCompiler_psycopg2 object at 0x7f5b83f83560>, [{'author': 2107, 'branch': None, 'ci_passed': True, 'commitid': '40a192c28bd861177c2023873f70f7dc6421359c', ...}])
    conn = <sqlalchemy.pool.base._ConnectionFairy object at 0x7f5b83987890>
    context = <sqlalchemy.dialects.postgresql.psycopg2.PGExecutionContext_psycopg2 object at 0x7f5b628d8fb0>

    def _execute_context(
    self, dialect, constructor, statement, parameters, *args
    ):
    """Create an :class:`.ExecutionContext` and execute, returning
    a :class:`_engine.ResultProxy`.

    """

    try:
    try:
    conn = self.__connection
    except AttributeError:
    # escape "except AttributeError" before revalidating
    # to prevent misleading stacktraces in Py3K
    conn = None
    if conn is None:
    conn = self._revalidate_connection()

    context = constructor(dialect, self, conn, *args)
    except BaseException as e:
    self._handle_dbapi_exception(
    e, util.text_type(statement), parameters, None, None
    )

    if context.compiled:
    context.pre_exec()

    cursor, statement, parameters = (
    context.cursor,
    context.statement,
    context.parameters,
    )

    if not context.executemany:
    parameters = parameters[0]

    if self._has_events or self.engine._has_events:
    for fn in self.dispatch.before_cursor_execute:
    statement, parameters = fn(
    self,
    cursor,
    statement,
    parameters,
    context,
    context.executemany,
    )

    if self._echo:
    self.engine.logger.info(statement)
    if not self.engine.hide_parameters:
    self.engine.logger.info(
    "%r",
    sql_util._repr_params(
    parameters, batches=10, ismulti=context.executemany
    ),
    )
    else:
    self.engine.logger.info(
    "[SQL parameters hidden due to hide_parameters=True]"
    )

    evt_handled = False
    try:
    if context.executemany:
    if self.dialect._has_events:
    for fn in self.dialect.dispatch.do_executemany:
    if fn(cursor, statement, parameters, context):
    evt_handled = True
    break
    if not evt_handled:
    self.dialect.do_executemany(
    cursor, statement, parameters, context
    )
    elif not parameters and context.no_parameters:
    if self.dialect._has_events:
    for fn in self.dialect.dispatch.do_execute_no_params:
    if fn(cursor, statement, context):
    evt_handled = True
    break
    if not evt_handled:
    self.dialect.do_execute_no_params(
    cursor, statement, context
    )
    else:
    if self.dialect._has_events:
    for fn in self.dialect.dispatch.do_execute:
    if fn(cursor, statement, parameters, context):
    evt_handled = True
    break
    if not evt_handled:
    > self.dialect.do_execute(
    cursor, statement, parameters, context
    )

    .../local/lib/python3.12.../sqlalchemy/engine/base.py:1276:
    _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

    self = <sqlalchemy.dialects.postgresql.psycopg2.PGDialect_psycopg2 object at 0x7f5b8a64deb0>
    cursor = <cursor object at 0x7f5b83904b80; closed: -1>
    statement = 'INSERT INTO commits (id, author, branch, ci_passed, commitid, deleted, message, notified, merged, parent, pullid, rep...s, %(pullid)s, %(repoid)s, %(state)s, %(timestamp)s, %(updatestamp)s, %(totals)s, %(report)s, %(report_storage_path)s)'
    parameters = {'author': 2107, 'branch': None, 'ci_passed': True, 'commitid': '40a192c28bd861177c2023873f70f7dc6421359c', ...}
    context = <sqlalchemy.dialects.postgresql.psycopg2.PGExecutionContext_psycopg2 object at 0x7f5b628d8fb0>

    def do_execute(self, cursor, statement, parameters, context=None):
    > cursor.execute(statement, parameters)
    E psycopg2.errors.InvalidTextRepresentation: invalid input value for enum commit_state: "completed"
    E LINE 1: ...92c28bd861177c2023873f70f7dc6421359c', NULL, 715, 'completed...
    E ^

    .../local/lib/python3.12.../sqlalchemy/engine/default.py:608: InvalidTextRepresentation

    The above exception was the direct cause of the following exception:

    mocker = <pytest_mock.plugin.MockFixture object at 0x7f5b62d8b110>
    dbsession = <sqlalchemy.orm.session.Session object at 0x7f5b839849b0>
    mock_storage = <shared.storage.memory.MemoryStorageService object at 0x7f5b62713980>

    def test_bundle_analysis_process_associate_no_parent_commit_report_object(
        mocker,
        dbsession,
        mock_storage,
    ):
        storage_path = (
            ".../testing/ed1bdd67-8fd2-4cdb-ac9e-39b99e4a3892/bundle_report.sqlite"
        )
        mock_storage.write_file(get_bucket_name(), storage_path, "test-content")

        mocker.patch.object(
            BundleAnalysisProcessorTask,
            "app",
            tasks={
                bundle_analysis_save_measurements_task_name: mocker.MagicMock(),
            },
        )

        parent_commit = CommitFactory.create(state="completed")
        dbsession.add(parent_commit)
    >   dbsession.flush()

    .../tests/unit/test_bundle_analysis_processor_task.py:563:
    _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
    .../local/lib/python3.12.../sqlalchemy/orm/session.py:2540: in flush
    self._flush(objects)
    .../local/lib/python3.12.../sqlalchemy/orm/session.py:2681: in _flush
    with util.safe_reraise():
    .../local/lib/python3.12.../sqlalchemy/util/langhelpers.py:68: in __exit__
    compat.raise_(
    .../local/lib/python3.12.../sqlalchemy/util/compat.py:182: in raise_
    raise exception
    .../local/lib/python3.12.../sqlalchemy/orm/session.py:2642: in _flush
    flush_context.execute()
    .../local/lib/python3.12.../sqlalchemy/orm/unitofwork.py:422: in execute
    rec.execute(self)
    .../local/lib/python3.12.../sqlalchemy/orm/unitofwork.py:586: in execute
    persistence.save_obj(
    .../local/lib/python3.12.../sqlalchemy/orm/persistence.py:239: in save_obj
    _emit_insert_statements(
    .../local/lib/python3.12.../sqlalchemy/orm/persistence.py:1083: in _emit_insert_statements
    c = cached_connections[connection].execute(statement, multiparams)
    .../local/lib/python3.12.../sqlalchemy/engine/base.py:1011: in execute
    return meth(self, multiparams, params)
    .../local/lib/python3.12.../sqlalchemy/sql/elements.py:298: in _execute_on_connection
    return connection._execute_clauseelement(self, multiparams, params)
    .../local/lib/python3.12.../sqlalchemy/engine/base.py:1124: in _execute_clauseelement
    ret = self._execute_context(
    .../local/lib/python3.12.../sqlalchemy/engine/base.py:1316: in _execute_context
    self._handle_dbapi_exception(
    .../local/lib/python3.12.../sqlalchemy/engine/base.py:1510: in _handle_dbapi_exception
    util.raise_(
    .../local/lib/python3.12.../sqlalchemy/util/compat.py:182: in raise_
    raise exception
    .../local/lib/python3.12.../sqlalchemy/engine/base.py:1276: in _execute_context
    self.dialect.do_execute(
    _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

    self = <sqlalchemy.dialects.postgresql.psycopg2.PGDialect_psycopg2 object at 0x7f5b8a64deb0>
    cursor = <cursor object at 0x7f5b83904b80; closed: -1>
    statement = 'INSERT INTO commits (id, author, branch, ci_passed, commitid, deleted, message, notified, merged, parent, pullid, rep...s, %(pullid)s, %(repoid)s, %(state)s, %(timestamp)s, %(updatestamp)s, %(totals)s, %(report)s, %(report_storage_path)s)'
    parameters = {'author': 2107, 'branch': None, 'ci_passed': True, 'commitid': '40a192c28bd861177c2023873f70f7dc6421359c', ...}
    context = <sqlalchemy.dialects.postgresql.psycopg2.PGExecutionContext_psycopg2 object at 0x7f5b628d8fb0>

    def do_execute(self, cursor, statement, parameters, context=None):
    > cursor.execute(statement, parameters)
    E sqlalchemy.exc.DataError: (psycopg2.errors.InvalidTextRepresentation) invalid input value for enum commit_state: "completed"
    E LINE 1: ...92c28bd861177c2023873f70f7dc6421359c', NULL, 715, 'completed...
    E ^
    E
    E [SQL: INSERT INTO commits (id, author, branch, ci_passed, commitid, deleted, message, notified, merged, parent, pullid, repoid, state, timestamp, updatestamp, totals, report, report_storage_path) VALUES (%(id)s, %(author)s, %(branch)s, %(ci_passed)s, %(commitid)s, %(deleted)s, %(message)s, %(notified)s, %(merged)s, %(parent)s, %(pullid)s, %(repoid)s, %(state)s, %(timestamp)s, %(updatestamp)s, %(totals)s, %(report)s, %(report_storage_path)s)]
    E [parameters: {'id': 1067, 'author': 2107, 'branch': None, 'ci_passed': True, 'commitid': '40a192c28bd861177c2023873f70f7dc6421359c', 'deleted': None, 'message': 'Across culture tonight above become couple.', 'notified': None, 'merged': None, 'parent': '40a192c28bd861177c2023873f70f7dc6421359c', 'pullid': None, 'repoid': 715, 'state': 'completed', 'timestamp': datetime.datetime(2019, 2, 1, 17, 59, 47, tzinfo=datetime.timezone.utc), 'updatestamp': None, 'totals': '{"C": 0, "M": 0, "N": 0, "b": 0, "c": "85.00000", "d": 0, "diff": [1, 2, 1, 1, 0, "50.00000", 0, 0, 0, 0, 0, 0, 0], "f": 3, "h": 17, "m": 3, "n": 20, "p": 0, "s": 1}', 'report': '{"files": {"awesome/__init__.py": [2, [0, 10, 8, 2, 0, "80.00000", 0, 0, 0, 0, 0, 0, 0], [[0, 10, 8, 2, 0, "80.00000", 0, 0, 0, 0, 0, 0, 0]], [0, 2, ... (483 characters truncated) ... ull, "d": 1547084427, "e": null, "f": ["unit"], "j": null, "n": null, "p": null, "t": [3, 20, 17, 3, 0, "85.00000", 0, 0, 0, 0, 0, 0, 0], "": null}}}', 'report_storage_path': None}]
    E (Background on this error at: http://sqlalche..../e/13/9h9h)

    .../local/lib/python3.12.../sqlalchemy/engine/default.py:608: DataError
  • Class name: tasks.tests.unit.test_bundle_analysis_processor_task
    Test name: test_bundle_analysis_processor_associate_custom_compare_sha

    self = <sqlalchemy.engine.base.Connection object at 0x7f5b62d8af60>
    dialect = <sqlalchemy.dialects.postgresql.psycopg2.PGDialect_psycopg2 object at 0x7f5b8a64deb0>
    constructor = <bound method DefaultExecutionContext._init_compiled of <class 'sqlalchemy.dialects.postgresql.psycopg2.PGExecutionContext_psycopg2'>>
    statement = 'INSERT INTO commits (id, author, branch, ci_passed, commitid, deleted, message, notified, merged, parent, pullid, rep...s, %(pullid)s, %(repoid)s, %(state)s, %(timestamp)s, %(updatestamp)s, %(totals)s, %(report)s, %(report_storage_path)s)'
    parameters = {'author': 2113, 'branch': None, 'ci_passed': True, 'commitid': '0a4f70b174abaf3ca7c73e14bbd25699c48f8248', ...}
    args = (<sqlalchemy.dialects.postgresql.psycopg2.PGCompiler_psycopg2 object at 0x7f5b83f83560>, [{'author': 2113, 'branch': None, 'ci_passed': True, 'commitid': '0a4f70b174abaf3ca7c73e14bbd25699c48f8248', ...}])
    conn = <sqlalchemy.pool.base._ConnectionFairy object at 0x7f5b62711700>
    context = <sqlalchemy.dialects.postgresql.psycopg2.PGExecutionContext_psycopg2 object at 0x7f5b79b4c890>

    def _execute_context(
    self, dialect, constructor, statement, parameters, *args
    ):
    """Create an :class:`.ExecutionContext` and execute, returning
    a :class:`_engine.ResultProxy`.

    """

    try:
    try:
    conn = self.__connection
    except AttributeError:
    # escape "except AttributeError" before revalidating
    # to prevent misleading stacktraces in Py3K
    conn = None
    if conn is None:
    conn = self._revalidate_connection()

    context = constructor(dialect, self, conn, *args)
    except BaseException as e:
    self._handle_dbapi_exception(
    e, util.text_type(statement), parameters, None, None
    )

    if context.compiled:
    context.pre_exec()

    cursor, statement, parameters = (
    context.cursor,
    context.statement,
    context.parameters,
    )

    if not context.executemany:
    parameters = parameters[0]

    if self._has_events or self.engine._has_events:
    for fn in self.dispatch.before_cursor_execute:
    statement, parameters = fn(
    self,
    cursor,
    statement,
    parameters,
    context,
    context.executemany,
    )

    if self._echo:
    self.engine.logger.info(statement)
    if not self.engine.hide_parameters:
    self.engine.logger.info(
    "%r",
    sql_util._repr_params(
    parameters, batches=10, ismulti=context.executemany
    ),
    )
    else:
    self.engine.logger.info(
    "[SQL parameters hidden due to hide_parameters=True]"
    )

    evt_handled = False
    try:
    if context.executemany:
    if self.dialect._has_events:
    for fn in self.dialect.dispatch.do_executemany:
    if fn(cursor, statement, parameters, context):
    evt_handled = True
    break
    if not evt_handled:
    self.dialect.do_executemany(
    cursor, statement, parameters, context
    )
    elif not parameters and context.no_parameters:
    if self.dialect._has_events:
    for fn in self.dialect.dispatch.do_execute_no_params:
    if fn(cursor, statement, context):
    evt_handled = True
    break
    if not evt_handled:
    self.dialect.do_execute_no_params(
    cursor, statement, context
    )
    else:
    if self.dialect._has_events:
    for fn in self.dialect.dispatch.do_execute:
    if fn(cursor, statement, parameters, context):
    evt_handled = True
    break
    if not evt_handled:
    > self.dialect.do_execute(
    cursor, statement, parameters, context
    )

    .../local/lib/python3.12.../sqlalchemy/engine/base.py:1276:
    _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

    self = <sqlalchemy.dialects.postgresql.psycopg2.PGDialect_psycopg2 object at 0x7f5b8a64deb0>
    cursor = <cursor object at 0x7f5b83e13790; closed: -1>
    statement = 'INSERT INTO commits (id, author, branch, ci_passed, commitid, deleted, message, notified, merged, parent, pullid, rep...s, %(pullid)s, %(repoid)s, %(state)s, %(timestamp)s, %(updatestamp)s, %(totals)s, %(report)s, %(report_storage_path)s)'
    parameters = {'author': 2113, 'branch': None, 'ci_passed': True, 'commitid': '0a4f70b174abaf3ca7c73e14bbd25699c48f8248', ...}
    context = <sqlalchemy.dialects.postgresql.psycopg2.PGExecutionContext_psycopg2 object at 0x7f5b79b4c890>

    def do_execute(self, cursor, statement, parameters, context=None):
    > cursor.execute(statement, parameters)
    E psycopg2.errors.InvalidTextRepresentation: invalid input value for enum commit_state: "completed"
    E LINE 1: ...70b174abaf3ca7c73e14bbd25699c48f8248', NULL, 718, 'completed...
    E ^

    .../local/lib/python3.12.../sqlalchemy/engine/default.py:608: InvalidTextRepresentation

    The above exception was the direct cause of the following exception:

    mocker = <pytest_mock.plugin.MockFixture object at 0x7f5b62710920>
    dbsession = <sqlalchemy.orm.session.Session object at 0x7f5b62712540>
    mock_storage = <shared.storage.memory.MemoryStorageService object at 0x7f5b6c535ac0>

    @pytest.mark.django_db(databases={"default", "timeseries"})
    def test_bundle_analysis_processor_associate_custom_compare_sha(
        mocker,
        dbsession,
        mock_storage,
    ):
        storage_path = (
            ".../testing/ed1bdd67-8fd2-4cdb-ac9e-39b99e4a3892/bundle_report.sqlite"
        )
        mock_storage.write_file(get_bucket_name(), storage_path, "test-content")

        mocker.patch.object(
            BundleAnalysisProcessorTask,
            "app",
            tasks={
                bundle_analysis_save_measurements_task_name: mocker.MagicMock(),
            },
        )

        parent_commit = CommitFactory.create(state="completed")
        dbsession.add(parent_commit)
    >   dbsession.flush()

    .../tests/unit/test_bundle_analysis_processor_task.py:759:
    _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
    .../local/lib/python3.12.../sqlalchemy/orm/session.py:2540: in flush
    self._flush(objects)
    .../local/lib/python3.12.../sqlalchemy/orm/session.py:2681: in _flush
    with util.safe_reraise():
    .../local/lib/python3.12.../sqlalchemy/util/langhelpers.py:68: in __exit__
    compat.raise_(
    .../local/lib/python3.12.../sqlalchemy/util/compat.py:182: in raise_
    raise exception
    .../local/lib/python3.12.../sqlalchemy/orm/session.py:2642: in _flush
    flush_context.execute()
    .../local/lib/python3.12.../sqlalchemy/orm/unitofwork.py:422: in execute
    rec.execute(self)
    .../local/lib/python3.12.../sqlalchemy/orm/unitofwork.py:586: in execute
    persistence.save_obj(
    .../local/lib/python3.12.../sqlalchemy/orm/persistence.py:239: in save_obj
    _emit_insert_statements(
    .../local/lib/python3.12.../sqlalchemy/orm/persistence.py:1083: in _emit_insert_statements
    c = cached_connections[connection].execute(statement, multiparams)
    .../local/lib/python3.12.../sqlalchemy/engine/base.py:1011: in execute
    return meth(self, multiparams, params)
    .../local/lib/python3.12.../sqlalchemy/sql/elements.py:298: in _execute_on_connection
    return connection._execute_clauseelement(self, multiparams, params)
    .../local/lib/python3.12.../sqlalchemy/engine/base.py:1124: in _execute_clauseelement
    ret = self._execute_context(
    .../local/lib/python3.12.../sqlalchemy/engine/base.py:1316: in _execute_context
    self._handle_dbapi_exception(
    .../local/lib/python3.12.../sqlalchemy/engine/base.py:1510: in _handle_dbapi_exception
    util.raise_(
    .../local/lib/python3.12.../sqlalchemy/util/compat.py:182: in raise_
    raise exception
    .../local/lib/python3.12.../sqlalchemy/engine/base.py:1276: in _execute_context
    self.dialect.do_execute(
    _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

    self = <sqlalchemy.dialects.postgresql.psycopg2.PGDialect_psycopg2 object at 0x7f5b8a64deb0>
    cursor = <cursor object at 0x7f5b83e13790; closed: -1>
    statement = 'INSERT INTO commits (id, author, branch, ci_passed, commitid, deleted, message, notified, merged, parent, pullid, rep...s, %(pullid)s, %(repoid)s, %(state)s, %(timestamp)s, %(updatestamp)s, %(totals)s, %(report)s, %(report_storage_path)s)'
    parameters = {'author': 2113, 'branch': None, 'ci_passed': True, 'commitid': '0a4f70b174abaf3ca7c73e14bbd25699c48f8248', ...}
    context = <sqlalchemy.dialects.postgresql.psycopg2.PGExecutionContext_psycopg2 object at 0x7f5b79b4c890>

    def do_execute(self, cursor, statement, parameters, context=None):
    > cursor.execute(statement, parameters)
    E sqlalchemy.exc.DataError: (psycopg2.errors.InvalidTextRepresentation) invalid input value for enum commit_state: "completed"
    E LINE 1: ...70b174abaf3ca7c73e14bbd25699c48f8248', NULL, 718, 'completed...
    E ^
    E
    E [SQL: INSERT INTO commits (id, author, branch, ci_passed, commitid, deleted, message, notified, merged, parent, pullid, repoid, state, timestamp, updatestamp, totals, report, report_storage_path) VALUES (%(id)s, %(author)s, %(branch)s, %(ci_passed)s, %(commitid)s, %(deleted)s, %(message)s, %(notified)s, %(merged)s, %(parent)s, %(pullid)s, %(repoid)s, %(state)s, %(timestamp)s, %(updatestamp)s, %(totals)s, %(report)s, %(report_storage_path)s)]
    E [parameters: {'id': 1070, 'author': 2113, 'branch': None, 'ci_passed': True, 'commitid': '0a4f70b174abaf3ca7c73e14bbd25699c48f8248', 'deleted': None, 'message': 'Support save worker officer matter.', 'notified': None, 'merged': None, 'parent': '0a4f70b174abaf3ca7c73e14bbd25699c48f8248', 'pullid': None, 'repoid': 718, 'state': 'completed', 'timestamp': datetime.datetime(2019, 2, 1, 17, 59, 47, tzinfo=datetime.timezone.utc), 'updatestamp': None, 'totals': '{"C": 0, "M": 0, "N": 0, "b": 0, "c": "85.00000", "d": 0, "diff": [1, 2, 1, 1, 0, "50.00000", 0, 0, 0, 0, 0, 0, 0], "f": 3, "h": 17, "m": 3, "n": 20, "p": 0, "s": 1}', 'report': '{"files": {"awesome/__init__.py": [2, [0, 10, 8, 2, 0, "80.00000", 0, 0, 0, 0, 0, 0, 0], [[0, 10, 8, 2, 0, "80.00000", 0, 0, 0, 0, 0, 0, 0]], [0, 2, ... (483 characters truncated) ... ull, "d": 1547084427, "e": null, "f": ["unit"], "j": null, "n": null, "p": null, "t": [3, 20, 17, 3, 0, "85.00000", 0, 0, 0, 0, 0, 0, 0], "": null}}}', 'report_storage_path': None}]
    E (Background on this error at: http://sqlalche..../e/13/9h9h)

    .../local/lib/python3.12.../sqlalchemy/engine/default.py:608: DataError
  • Class name: tasks.tests.unit.test_ghm_sync_plans.TestGHMarketplaceSyncPlansTaskUnit
    Test name: test_create_or_update_plan_known_user_with_plan

    self = <sqlalchemy.engine.base.Connection object at 0x7f5b624fabd0>
    dialect = <sqlalchemy.dialects.postgresql.psycopg2.PGDialect_psycopg2 object at 0x7f5b8a64deb0>
    constructor = <bound method DefaultExecutionContext._init_compiled of <class 'sqlalchemy.dialects.postgresql.psycopg2.PGExecutionContext_psycopg2'>>
    statement = 'INSERT INTO owners (service, service_id, name, email, username, plan_activated_users, admins, permission, organizatio...n_user_count)s, %(stripe_customer_id)s, %(stripe_subscription_id)s, %(onboarding_completed)s) RETURNING owners.ownerid'
    parameters = {'admins': [], 'email': '[email protected]', 'free': 0, 'name': 'Blake Russo', ...}
    args = (<sqlalchemy.dialects.postgresql.psycopg2.PGCompiler_psycopg2 object at 0x7f5b624fbfe0>, [{'admins': [], 'email': '[email protected]', 'free': 0, 'name': 'Blake Russo', ...}])
    conn = <sqlalchemy.pool.base._ConnectionFairy object at 0x7f5b88173e90>
    context = <sqlalchemy.dialects.postgresql.psycopg2.PGExecutionContext_psycopg2 object at 0x7f5b624f8980>

    def _execute_context(
    self, dialect, constructor, statement, parameters, *args
    ):
    """Create an :class:`.ExecutionContext` and execute, returning
    a :class:`_engine.ResultProxy`.

    """

    try:
    try:
    conn = self.__connection
    except AttributeError:
    # escape "except AttributeError" before revalidating
    # to prevent misleading stacktraces in Py3K
    conn = None
    if conn is None:
    conn = self._revalidate_connection()

    context = constructor(dialect, self, conn, *args)
    except BaseException as e:
    self._handle_dbapi_exception(
    e, util.text_type(statement), parameters, None, None
    )

    if context.compiled:
    context.pre_exec()

    cursor, statement, parameters = (
    context.cursor,
    context.statement,
    context.parameters,
    )

    if not context.executemany:
    parameters = parameters[0]

    if self._has_events or self.engine._has_events:
    for fn in self.dispatch.before_cursor_execute:
    statement, parameters = fn(
    self,
    cursor,
    statement,
    parameters,
    context,
    context.executemany,
    )

    if self._echo:
    self.engine.logger.info(statement)
    if not self.engine.hide_parameters:
    self.engine.logger.info(
    "%r",
    sql_util._repr_params(
    parameters, batches=10, ismulti=context.executemany
    ),
    )
    else:
    self.engine.logger.info(
    "[SQL parameters hidden due to hide_parameters=True]"
    )

    evt_handled = False
    try:
    if context.executemany:
    if self.dialect._has_events:
    for fn in self.dialect.dispatch.do_executemany:
    if fn(cursor, statement, parameters, context):
    evt_handled = True
    break
    if not evt_handled:
    self.dialect.do_executemany(
    cursor, statement, parameters, context
    )
    elif not parameters and context.no_parameters:
    if self.dialect._has_events:
    for fn in self.dialect.dispatch.do_execute_no_params:
    if fn(cursor, statement, context):
    evt_handled = True
    break
    if not evt_handled:
    self.dialect.do_execute_no_params(
    cursor, statement, context
    )
    else:
    if self.dialect._has_events:
    for fn in self.dialect.dispatch.do_execute:
    if fn(cursor, statement, parameters, context):
    evt_handled = True
    break
    if not evt_handled:
    > self.dialect.do_execute(
    cursor, statement, parameters, context
    )

    .../local/lib/python3.12.../sqlalchemy/engine/base.py:1276:
    _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

    self = <sqlalchemy.dialects.postgresql.psycopg2.PGDialect_psycopg2 object at 0x7f5b8a64deb0>
    cursor = <cursor object at 0x7f5b83907b50; closed: -1>
    statement = 'INSERT INTO owners (service, service_id, name, email, username, plan_activated_users, admins, permission, organizatio...n_user_count)s, %(stripe_customer_id)s, %(stripe_subscription_id)s, %(onboarding_completed)s) RETURNING owners.ownerid'
    parameters = {'admins': [], 'email': '[email protected]', 'free': 0, 'name': 'Blake Russo', ...}
    context = <sqlalchemy.dialects.postgresql.psycopg2.PGExecutionContext_psycopg2 object at 0x7f5b624f8980>

    def do_execute(self, cursor, statement, parameters, context=None):
    > cursor.execute(statement, parameters)
    E psycopg2.errors.InvalidTextRepresentation: invalid input value for enum plans: "some-plan"
    E LINE 1: ...09-20T03:09:30.243969'::timestamp, 'not_started', 'some-plan...
    E ^

    .../local/lib/python3.12.../sqlalchemy/engine/default.py:608: InvalidTextRepresentation

    The above exception was the direct cause of the following exception:

    self = <worker.tasks.tests.unit.test_ghm_sync_plans.TestGHMarketplaceSyncPlansTaskUnit object at 0x7f5b8a38af90>
    dbsession = <sqlalchemy.orm.session.Session object at 0x7f5b881728d0>
    mocker = <pytest_mock.plugin.MockFixture object at 0x7f5b628a1490>

    def test_create_or_update_plan_known_user_with_plan(self, dbsession, mocker):
        owner = OwnerFactory.create(
            service="github",
            plan="some-plan",
            plan_user_count=10,
            plan_activated_users=[34123, 231, 2314212],
            stripe_customer_id="cus_123",
            stripe_subscription_id="sub_123",
        )
        dbsession.add(owner)
        repo = RepositoryFactory.create(
            private=True, service_id="12071992", activated=True, owner=owner
        )
        dbsession.add(repo)
    >   dbsession.flush()

    .../tests/unit/test_ghm_sync_plans.py:88:
    _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
    .../local/lib/python3.12.../sqlalchemy/orm/session.py:2540: in flush
    self._flush(objects)
    .../local/lib/python3.12.../sqlalchemy/orm/session.py:2681: in _flush
    with util.safe_reraise():
    .../local/lib/python3.12.../sqlalchemy/util/langhelpers.py:68: in __exit__
    compat.raise_(
    .../local/lib/python3.12.../sqlalchemy/util/compat.py:182: in raise_
    raise exception
    .../local/lib/python3.12.../sqlalchemy/orm/session.py:2642: in _flush
    flush_context.execute()
    .../local/lib/python3.12.../sqlalchemy/orm/unitofwork.py:422: in execute
    rec.execute(self)
    .../local/lib/python3.12.../sqlalchemy/orm/unitofwork.py:586: in execute
    persistence.save_obj(
    .../local/lib/python3.12.../sqlalchemy/orm/persistence.py:239: in save_obj
    _emit_insert_statements(
    .../local/lib/python3.12.../sqlalchemy/orm/persistence.py:1135: in _emit_insert_statements
    result = cached_connections[connection].execute(
    .../local/lib/python3.12.../sqlalchemy/engine/base.py:1011: in execute
    return meth(self, multiparams, params)
    .../local/lib/python3.12.../sqlalchemy/sql/elements.py:298: in _execute_on_connection
    return connection._execute_clauseelement(self, multiparams, params)
    .../local/lib/python3.12.../sqlalchemy/engine/base.py:1124: in _execute_clauseelement
    ret = self._execute_context(
    .../local/lib/python3.12.../sqlalchemy/engine/base.py:1316: in _execute_context
    self._handle_dbapi_exception(
    .../local/lib/python3.12.../sqlalchemy/engine/base.py:1510: in _handle_dbapi_exception
    util.raise_(
    .../local/lib/python3.12.../sqlalchemy/util/compat.py:182: in raise_
    raise exception
    .../local/lib/python3.12.../sqlalchemy/engine/base.py:1276: in _execute_context
    self.dialect.do_execute(
    _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

    self = <sqlalchemy.dialects.postgresql.psycopg2.PGDialect_psycopg2 object at 0x7f5b8a64deb0>
    cursor = <cursor object at 0x7f5b83907b50; closed: -1>
    statement = 'INSERT INTO owners (service, service_id, name, email, username, plan_activated_users, admins, permission, organizatio...n_user_count)s, %(stripe_customer_id)s, %(stripe_subscription_id)s, %(onboarding_completed)s) RETURNING owners.ownerid'
    parameters = {'admins': [], 'email': '[email protected]', 'free': 0, 'name': 'Blake Russo', ...}
    context = <sqlalchemy.dialects.postgresql.psycopg2.PGExecutionContext_psycopg2 object at 0x7f5b624f8980>

    def do_execute(self, cursor, statement, parameters, context=None):
    > cursor.execute(statement, parameters)
    E sqlalchemy.exc.DataError: (psycopg2.errors.InvalidTextRepresentation) invalid input value for enum plans: "some-plan"
    E LINE 1: ...09-20T03:09:30.243969'::timestamp, 'not_started', 'some-plan...
    E ^
    E
    E [SQL: INSERT INTO owners (service, service_id, name, email, username, plan_activated_users, admins, permission, organizations, free, oauth_token, trial_start_date, trial_end_date, trial_status, plan, plan_user_count, stripe_customer_id, stripe_subscription_id, onboarding_completed) VALUES (%(service)s, %(service_id)s, %(name)s, %(email)s, %(username)s, %(plan_activated_users)s, %(admins)s, %(permission)s, %(organizations)s, %(free)s, %(oauth_token)s, %(trial_start_date)s, %(trial_end_date)s, %(trial_status)s, %(plan)s, %(plan_user_count)s, %(stripe_customer_id)s, %(stripe_subscription_id)s, %(onboarding_completed)s) RETURNING owners.ownerid]
    E [parameters: {'service': 'github', 'service_id': 'user2391', 'name': 'Blake Russo', 'email': '[email protected]', 'username': 'scottamanda', 'plan_activated_users': [34123, 231, 2314212], 'admins': [], 'permission': [], 'organizations': [], 'free': 0, 'oauth_token': b'mOayneF1wlCkwjVl1+f6f1dChMBo6v1VLqeayZJpFeLaH/lMRm0iyWvCVv8ksD0bECRc8hFipk8fchzQ4MoG/g==', 'trial_start_date': datetime.datetime(2024, 9, 20, 3, 9, 30, 243965), 'trial_end_date': datetime.datetime(2024, 9, 20, 3, 9, 30, 243969), 'trial_status': 'not_started', 'plan': 'some-plan', 'plan_user_count': 10, 'stripe_customer_id': 'cus_123', 'stripe_subscription_id': 'sub_123', 'onboarding_completed': False}]
    E (Background on this error at: http://sqlalche..../e/13/9h9h)

    .../local/lib/python3.12.../sqlalchemy/engine/default.py:608: DataError
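
    Note: here the enforced `plans` enum rejects the placeholder value "some-plan" passed to the factory. A hedged sketch of a factory call that should satisfy the constraint, assuming `BillingPlan.users_basic.value` (already referenced by this module's assertions) is a member of the enum:

        # Hedged sketch: plan must be a member of the DB "plans" enum; the
        # BillingPlan value used by the module's own assertions is assumed
        # to be one of its members.
        owner = OwnerFactory.create(
            service="github",
            plan=BillingPlan.users_basic.value,
            plan_user_count=10,
            plan_activated_users=[34123, 231, 2314212],
            stripe_customer_id="cus_123",
            stripe_subscription_id="sub_123",
        )
        dbsession.add(owner)
        dbsession.flush()
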
  • Class name: tasks.tests.unit.test_ghm_sync_plans.TestGHMarketplaceSyncPlansTaskUnit
    Test name: test_create_or_update_to_free_plan_known_user

    self = <worker.tasks.tests.unit.test_ghm_sync_plans.TestGHMarketplaceSyncPlansTaskUnit object at 0x7f5b8a38ba70>
    dbsession = <sqlalchemy.orm.session.Session object at 0x7f5b83748650>
    mocker = <pytest_mock.plugin.MockFixture object at 0x7f5b881703e0>

    def test_create_or_update_to_free_plan_known_user(self, dbsession, mocker):
        owner = OwnerFactory.create(
            service="github",
            plan="users",
            plan_user_count=2,
            plan_activated_users=[1, 2],
        )
        dbsession.add(owner)
        repo = RepositoryFactory.create(
            private=True, service_id="12071992", activated=True, owner=owner
        )
        dbsession.add(repo)
        dbsession.flush()

        ghm_service = mocker.MagicMock(get_user=mocker.MagicMock())
        SyncPlansTask().create_or_update_to_free_plan(
            dbsession, ghm_service, owner.service_id
        )

        assert not ghm_service.get_user.called
        assert owner.plan == BillingPlan.users_basic.value
        assert owner.plan_user_count == 1
        assert owner.plan_activated_users is None
        # Owner was already created, we don't update this value
    >   assert owner.createstamp is None
    E   assert datetime.datetime(2024, 9, 20, 3, 10, 45, 486784, tzinfo=datetime.timezone.utc) is None
    E    +  where datetime.datetime(2024, 9, 20, 3, 10, 45, 486784, tzinfo=datetime.timezone.utc) = Owner<2306@service<github>>.createstamp

    .../tests/unit/test_ghm_sync_plans.py:34: AssertionError
  • Class name: tasks.tests.unit.test_ghm_sync_plans.TestGHMarketplaceSyncPlansTaskUnit
    Test name: test_create_or_update_to_free_plan_unknown_user

    self = <worker.tasks.tests.unit.test_ghm_sync_plans.TestGHMarketplaceSyncPlansTaskUnit object at 0x7f5b8a38b4a0>
    dbsession = <sqlalchemy.orm.session.Session object at 0x7f5b83a6af90>
    mocker = <pytest_mock.plugin.MockFixture object at 0x7f5b627eab40>

    @freeze_time("2024-03-28T00:00:00")
    def test_create_or_update_to_free_plan_unknown_user(self, dbsession, mocker):
        service_id = "12345"
        username = "tomcat"
        name = "Tom Cat"
        email = "[email protected]"
        ghm_service = mocker.MagicMock(
            get_user=mocker.MagicMock(
                return_value=dict(login=username, name=name, email=email)
            )
        )
        SyncPlansTask().create_or_update_to_free_plan(
            dbsession, ghm_service, service_id
        )

        assert ghm_service.get_user.called

        owner = (
            dbsession.query(Owner)
            .filter(Owner.service_id == service_id, Owner.service == "github")
            .first()
        )
        assert owner.username == username
        assert owner.name == name
        assert owner.email == email
    >   assert owner.createstamp.isoformat() == "2024-03-28T00:00:00"
    E   AssertionError: assert '2024-03-28T00:00:00+00:00' == '2024-03-28T00:00:00'
    E
    E   - 2024-03-28T00:00:00
    E   + 2024-03-28T00:00:00+00:00
    E   ? ++++++

    .../tests/unit/test_ghm_sync_plans.py:72: AssertionError
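
    Note: both `createstamp` failures above follow from the Django-migrated column, which is populated on insert and returned as a timezone-aware datetime, so expectations written against `None` or a naive ISO string no longer hold. A minimal sketch of timezone-aware expectations, using the `owner` object from the surrounding test:

        import datetime as dt

        # Hedged sketch: once the column is timezone-aware, compare against
        # aware datetimes (or an isoformat string that carries the offset).
        expected = dt.datetime(2024, 3, 28, tzinfo=dt.timezone.utc)
        assert owner.createstamp == expected
        assert owner.createstamp.isoformat() == "2024-03-28T00:00:00+00:00"
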
  • Class name: tasks.tests.unit.test_process_flakes
    Test name: test_generate_flake_dict

    No failure message available
  • Class name: tasks.tests.unit.test_process_flakes
    Test name: test_get_test_instances_when_instance_is_failure

    No failure message available
  • Class name: tasks.tests.unit.test_process_flakes
    Test name: test_get_test_instances_when_instance_is_pass

    No failure message available
  • Class name: tasks.tests.unit.test_process_flakes
    Test name: test_get_test_instances_when_test_is_flaky

    No failure message available
  • Class name: tasks.tests.unit.test_process_flakes
    Test name: test_get_test_instances_when_test_is_flaky_and_instance_is_skip

    No failure message available
  • Class name: tasks.tests.unit.test_process_flakes
    Test name: test_it_creates_flakes_expires

    self = <django.db.backends.utils.CursorWrapper object at 0x7f5b5645fd10>
    sql = 'INSERT INTO "reports_test" ("id", "external_id", "created_at", "updated_at", "repoid", "name", "testsuite", "flags_hash", "failure_rate", "commits_where_fail") VALUES (%s, %s, %s, %s, %s, %s, %s, %s, %s, %s::text[])'
    params = ('0', UUID('d196158a-a839-488f-80b4-8a1e4d89fb1e'), datetime.datetime(2024, 9, 20, 3, 11, 37, 617969, tzinfo=datetime.timezone.utc), datetime.datetime(2024, 9, 20, 3, 11, 37, 617969, tzinfo=datetime.timezone.utc), 1154, '', ...)
    ignored_wrapper_args = (False, {'connection': <DatabaseWrapper vendor='postgresql' alias='default'>, 'cursor': <django.db.backends.utils.CursorWrapper object at 0x7f5b5645fd10>})

    def _execute(self, sql, params, *ignored_wrapper_args):
    self.db.validate_no_broken_transaction()
    with self.db.wrap_database_errors:
    if params is None:
    # params default might be backend specific.
    return self.cursor.execute(sql)
    else:
    > return self.cursor.execute(sql, params)
    E psycopg2.errors.UniqueViolation: duplicate key value violates unique constraint "reports_test_pkey"
    E DETAIL: Key (id)=(0) already exists.

    .../local/lib/python3.12.../db/backends/utils.py:89: UniqueViolation

    The above exception was the direct cause of the following exception:

    transactional_db = None

    def test_it_creates_flakes_expires(transactional_db):
        with time_machine.travel(dt.datetime.now(tz=dt.UTC), tick=False) as traveller:
            rs = RepoSimulator()
            commits = []
            c1 = rs.create_commit()
            rs.merge(c1)
    >       rs.add_test_instance(c1, outcome=TestInstance.Outcome.FAILURE.value)

    .../tests/unit/test_process_flakes.py:352:
    _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
    .../tests/unit/test_process_flakes.py:52: in add_test_instance
    test=self.test_map[self.test_count],
    .../tests/unit/test_process_flakes.py:29: in <lambda>
    self.test_map = defaultdict(lambda: TestFactory(id=self.test_count))
    .../local/lib/python3.12............/site-packages/factory/base.py:40: in __call__
    return cls.create(**kwargs)
    .../local/lib/python3.12............/site-packages/factory/base.py:528: in create
    return cls._generate(enums.CREATE_STRATEGY, kwargs)
    .../local/lib/python3.12....../site-packages/factory/django.py:117: in _generate
    return super()._generate(strategy, params)
    .../local/lib/python3.12............/site-packages/factory/base.py:465: in _generate
    return step.build()
    .../local/lib/python3.12.../site-packages/factory/builder.py:262: in build
    instance = self.factory_meta.instantiate(
    .../local/lib/python3.12............/site-packages/factory/base.py:317: in instantiate
    return self.factory._create(model, *args, **kwargs)
    .../local/lib/python3.12....../site-packages/factory/django.py:166: in _create
    return manager.create(*args, **kwargs)
    .../local/lib/python3.12.../db/models/manager.py:87: in manager_method
    return getattr(self.get_queryset(), name)(*args, **kwargs)
    .../local/lib/python3.12.../db/models/query.py:658: in create
    obj.save(force_insert=True, using=self.db)
    .../local/lib/python3.12.../db/models/base.py:814: in save
    self.save_base(
    .../local/lib/python3.12.../db/models/base.py:877: in save_base
    updated = self._save_table(
    .../local/lib/python3.12.../db/models/base.py:1020: in _save_table
    results = self._do_insert(
    .../local/lib/python3.12.../db/models/base.py:1061: in _do_insert
    return manager._insert(
    .../local/lib/python3.12.../db/models/manager.py:87: in manager_method
    return getattr(self.get_queryset(), name)(*args, **kwargs)
    .../local/lib/python3.12.../db/models/query.py:1805: in _insert
    return query.get_compiler(using=using).execute_sql(returning_fields)
    .../local/lib/python3.12.../models/sql/compiler.py:1822: in execute_sql
    cursor.execute(sql, params)
    .../local/lib/python3.12.../db/backends/utils.py:67: in execute
    return self._execute_with_wrappers(
    .../local/lib/python3.12.../db/backends/utils.py:80: in _execute_with_wrappers
    return executor(sql, params, many, context)
    .../local/lib/python3.12.../db/backends/utils.py:84: in _execute
    with self.db.wrap_database_errors:
    .../local/lib/python3.12.../django/db/utils.py:91: in __exit__
    raise dj_exc_value.with_traceback(traceback) from exc_value
    _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

    self = <django.db.backends.utils.CursorWrapper object at 0x7f5b5645fd10>
    sql = 'INSERT INTO "reports_test" ("id", "external_id", "created_at", "updated_at", "repoid", "name", "testsuite", "flags_hash", "failure_rate", "commits_where_fail") VALUES (%s, %s, %s, %s, %s, %s, %s, %s, %s, %s::text[])'
    params = ('0', UUID('d196158a-a839-488f-80b4-8a1e4d89fb1e'), datetime.datetime(2024, 9, 20, 3, 11, 37, 617969, tzinfo=datetime.timezone.utc), datetime.datetime(2024, 9, 20, 3, 11, 37, 617969, tzinfo=datetime.timezone.utc), 1154, '', ...)
    ignored_wrapper_args = (False, {'connection': <DatabaseWrapper vendor='postgresql' alias='default'>, 'cursor': <django.db.backends.utils.CursorWrapper object at 0x7f5b5645fd10>})

    def _execute(self, sql, params, *ignored_wrapper_args):
    self.db.validate_no_broken_transaction()
    with self.db.wrap_database_errors:
    if params is None:
    # params default might be backend specific.
    return self.cursor.execute(sql)
    else:
    > return self.cursor.execute(sql, params)
    E django.db.utils.IntegrityError: duplicate key value violates unique constraint "reports_test_pkey"
    E DETAIL: Key (id)=(0) already exists.

    .../local/lib/python3.12.../db/backends/utils.py:89: IntegrityError
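
    Note: the `reports_test_pkey` violations in this and the following process_flakes failures all point at `TestFactory(id=self.test_count)` inside `RepoSimulator`: the counter restarts at 0 for every simulator, so any `reports_test` row with id 0 that survives from an earlier test makes the hard-coded primary key collide. One hedged way around it (an illustration only, not necessarily the fix this PR takes) is to stop forcing the counter into the primary key and let the factory assign ids:

        from collections import defaultdict

        class RepoSimulator:
            def __init__(self):
                self.test_count = 0
                # Instead of TestFactory(id=self.test_count), which reuses
                # id=0 across tests, let factory_boy / the database pick a
                # unique id and keep the counter purely as the dict key.
                # TestFactory here is the factory already imported by the
                # test module shown in the traceback.
                self.test_map = defaultdict(lambda: TestFactory())
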
  • Class name: tasks.tests.unit.test_process_flakes
    Test name: test_it_creates_flakes_fail_after_merge

    self = <django.db.backends.utils.CursorWrapper object at 0x7f5b565662d0>
    sql = 'INSERT INTO "reports_test" ("id", "external_id", "created_at", "updated_at", "repoid", "name", "testsuite", "flags_hash", "failure_rate", "commits_where_fail") VALUES (%s, %s, %s, %s, %s, %s, %s, %s, %s, %s::text[])'
    params = ('0', UUID('5d4f3657-35fe-4981-b3cb-c466053b07bc'), datetime.datetime(2024, 9, 20, 3, 9, 35, 773106, tzinfo=datetime.timezone.utc), datetime.datetime(2024, 9, 20, 3, 9, 35, 773106, tzinfo=datetime.timezone.utc), 1148, '', ...)
    ignored_wrapper_args = (False, {'connection': <DatabaseWrapper vendor='postgresql' alias='default'>, 'cursor': <django.db.backends.utils.CursorWrapper object at 0x7f5b565662d0>})

    def _execute(self, sql, params, *ignored_wrapper_args):
    self.db.validate_no_broken_transaction()
    with self.db.wrap_database_errors:
    if params is None:
    # params default might be backend specific.
    return self.cursor.execute(sql)
    else:
    > return self.cursor.execute(sql, params)
    E psycopg2.errors.UniqueViolation: duplicate key value violates unique constraint "reports_test_pkey"
    E DETAIL: Key (id)=(0) already exists.

    .../local/lib/python3.12.../db/backends/utils.py:89: UniqueViolation

    The above exception was the direct cause of the following exception:

    transactional_db = None

    @time_machine.travel(dt.datetime.now(tz=dt.UTC), tick=False)
    def test_it_creates_flakes_fail_after_merge(transactional_db):
        rs = RepoSimulator()
        c1 = rs.create_commit()

    >   rs.add_test_instance(c1, outcome=TestInstance.Outcome.FAILURE.value)

    .../tests/unit/test_process_flakes.py:259:
    _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
    .../tests/unit/test_process_flakes.py:52: in add_test_instance
    test=self.test_map[self.test_count],
    .../tests/unit/test_process_flakes.py:29: in <lambda>
    self.test_map = defaultdict(lambda: TestFactory(id=self.test_count))
    .../local/lib/python3.12............/site-packages/factory/base.py:40: in __call__
    return cls.create(**kwargs)
    .../local/lib/python3.12............/site-packages/factory/base.py:528: in create
    return cls._generate(enums.CREATE_STRATEGY, kwargs)
    .../local/lib/python3.12....../site-packages/factory/django.py:117: in _generate
    return super()._generate(strategy, params)
    .../local/lib/python3.12............/site-packages/factory/base.py:465: in _generate
    return step.build()
    .../local/lib/python3.12.../site-packages/factory/builder.py:262: in build
    instance = self.factory_meta.instantiate(
    .../local/lib/python3.12............/site-packages/factory/base.py:317: in instantiate
    return self.factory._create(model, *args, **kwargs)
    .../local/lib/python3.12....../site-packages/factory/django.py:166: in _create
    return manager.create(*args, **kwargs)
    .../local/lib/python3.12.../db/models/manager.py:87: in manager_method
    return getattr(self.get_queryset(), name)(*args, **kwargs)
    .../local/lib/python3.12.../db/models/query.py:658: in create
    obj.save(force_insert=True, using=self.db)
    .../local/lib/python3.12.../db/models/base.py:814: in save
    self.save_base(
    .../local/lib/python3.12.../db/models/base.py:877: in save_base
    updated = self._save_table(
    .../local/lib/python3.12.../db/models/base.py:1020: in _save_table
    results = self._do_insert(
    .../local/lib/python3.12.../db/models/base.py:1061: in _do_insert
    return manager._insert(
    .../local/lib/python3.12.../db/models/manager.py:87: in manager_method
    return getattr(self.get_queryset(), name)(*args, **kwargs)
    .../local/lib/python3.12.../db/models/query.py:1805: in _insert
    return query.get_compiler(using=using).execute_sql(returning_fields)
    .../local/lib/python3.12.../models/sql/compiler.py:1822: in execute_sql
    cursor.execute(sql, params)
    .../local/lib/python3.12.../db/backends/utils.py:67: in execute
    return self._execute_with_wrappers(
    .../local/lib/python3.12.../db/backends/utils.py:80: in _execute_with_wrappers
    return executor(sql, params, many, context)
    .../local/lib/python3.12.../db/backends/utils.py:84: in _execute
    with self.db.wrap_database_errors:
    .../local/lib/python3.12.../django/db/utils.py:91: in __exit__
    raise dj_exc_value.with_traceback(traceback) from exc_value
    _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

    self = <django.db.backends.utils.CursorWrapper object at 0x7f5b565662d0>
    sql = 'INSERT INTO "reports_test" ("id", "external_id", "created_at", "updated_at", "repoid", "name", "testsuite", "flags_hash", "failure_rate", "commits_where_fail") VALUES (%s, %s, %s, %s, %s, %s, %s, %s, %s, %s::text[])'
    params = ('0', UUID('5d4f3657-35fe-4981-b3cb-c466053b07bc'), datetime.datetime(2024, 9, 20, 3, 9, 35, 773106, tzinfo=datetime.timezone.utc), datetime.datetime(2024, 9, 20, 3, 9, 35, 773106, tzinfo=datetime.timezone.utc), 1148, '', ...)
    ignored_wrapper_args = (False, {'connection': <DatabaseWrapper vendor='postgresql' alias='default'>, 'cursor': <django.db.backends.utils.CursorWrapper object at 0x7f5b565662d0>})

    def _execute(self, sql, params, *ignored_wrapper_args):
    self.db.validate_no_broken_transaction()
    with self.db.wrap_database_errors:
    if params is None:
    # params default might be backend specific.
    return self.cursor.execute(sql)
    else:
    > return self.cursor.execute(sql, params)
    E django.db.utils.IntegrityError: duplicate key value violates unique constraint "reports_test_pkey"
    E DETAIL: Key (id)=(0) already exists.

    .../local/lib/python3.12.../db/backends/utils.py:89: IntegrityError
  • Class name: tasks.tests.unit.test_process_flakes
    Test name: test_it_creates_flakes_from_new_branch_only

    self = <django.db.backends.utils.CursorWrapper object at 0x7f5b56498e00>
    sql = 'INSERT INTO "reports_test" ("id", "external_id", "created_at", "updated_at", "repoid", "name", "testsuite", "flags_hash", "failure_rate", "commits_where_fail") VALUES (%s, %s, %s, %s, %s, %s, %s, %s, %s, %s::text[])'
    params = ('0', UUID('793d51ed-8361-43cf-b8d0-e47db6ee3797'), datetime.datetime(2024, 9, 20, 3, 9, 35, 773086, tzinfo=datetime.timezone.utc), datetime.datetime(2024, 9, 20, 3, 9, 35, 773086, tzinfo=datetime.timezone.utc), 1146, '', ...)
    ignored_wrapper_args = (False, {'connection': <DatabaseWrapper vendor='postgresql' alias='default'>, 'cursor': <django.db.backends.utils.CursorWrapper object at 0x7f5b56498e00>})

    def _execute(self, sql, params, *ignored_wrapper_args):
    self.db.validate_no_broken_transaction()
    with self.db.wrap_database_errors:
    if params is None:
    # params default might be backend specific.
    return self.cursor.execute(sql)
    else:
    > return self.cursor.execute(sql, params)
    E psycopg2.errors.UniqueViolation: duplicate key value violates unique constraint "reports_test_pkey"
    E DETAIL: Key (id)=(0) already exists.

    .../local/lib/python3.12.../db/backends/utils.py:89: UniqueViolation

    The above exception was the direct cause of the following exception:

    transactional_db = None

    @time_machine.travel(dt.datetime.now(tz=dt.UTC), tick=False)
    def test_it_creates_flakes_from_new_branch_only(transactional_db):
        rs = RepoSimulator()
        c1 = rs.create_commit()
        orig_branch = c1.branch
    >   rs.add_test_instance(c1)

    .../tests/unit/test_process_flakes.py:239:
    _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
    .../tests/unit/test_process_flakes.py:52: in add_test_instance
    test=self.test_map[self.test_count],
    .../tests/unit/test_process_flakes.py:29: in <lambda>
    self.test_map = defaultdict(lambda: TestFactory(id=self.test_count))
    .../local/lib/python3.12............/site-packages/factory/base.py:40: in __call__
    return cls.create(**kwargs)
    .../local/lib/python3.12............/site-packages/factory/base.py:528: in create
    return cls._generate(enums.CREATE_STRATEGY, kwargs)
    .../local/lib/python3.12....../site-packages/factory/django.py:117: in _generate
    return super()._generate(strategy, params)
    .../local/lib/python3.12............/site-packages/factory/base.py:465: in _generate
    return step.build()
    .../local/lib/python3.12.../site-packages/factory/builder.py:262: in build
    instance = self.factory_meta.instantiate(
    .../local/lib/python3.12............/site-packages/factory/base.py:317: in instantiate
    return self.factory._create(model, *args, **kwargs)
    .../local/lib/python3.12....../site-packages/factory/django.py:166: in _create
    return manager.create(*args, **kwargs)
    .../local/lib/python3.12.../db/models/manager.py:87: in manager_method
    return getattr(self.get_queryset(), name)(*args, **kwargs)
    .../local/lib/python3.12.../db/models/query.py:658: in create
    obj.save(force_insert=True, using=self.db)
    .../local/lib/python3.12.../db/models/base.py:814: in save
    self.save_base(
    .../local/lib/python3.12.../db/models/base.py:877: in save_base
    updated = self._save_table(
    .../local/lib/python3.12.../db/models/base.py:1020: in _save_table
    results = self._do_insert(
    .../local/lib/python3.12.../db/models/base.py:1061: in _do_insert
    return manager._insert(
    .../local/lib/python3.12.../db/models/manager.py:87: in manager_method
    return getattr(self.get_queryset(), name)(*args, **kwargs)
    .../local/lib/python3.12.../db/models/query.py:1805: in _insert
    return query.get_compiler(using=using).execute_sql(returning_fields)
    .../local/lib/python3.12.../models/sql/compiler.py:1822: in execute_sql
    cursor.execute(sql, params)
    .../local/lib/python3.12.../db/backends/utils.py:67: in execute
    return self._execute_with_wrappers(
    .../local/lib/python3.12.../db/backends/utils.py:80: in _execute_with_wrappers
    return executor(sql, params, many, context)
    .../local/lib/python3.12.../db/backends/utils.py:84: in _execute
    with self.db.wrap_database_errors:
    .../local/lib/python3.12.../django/db/utils.py:91: in __exit__
    raise dj_exc_value.with_traceback(traceback) from exc_value
    _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

    self = <django.db.backends.utils.CursorWrapper object at 0x7f5b56498e00>
    sql = 'INSERT INTO "reports_test" ("id", "external_id", "created_at", "updated_at", "repoid", "name", "testsuite", "flags_hash", "failure_rate", "commits_where_fail") VALUES (%s, %s, %s, %s, %s, %s, %s, %s, %s, %s::text[])'
    params = ('0', UUID('793d51ed-8361-43cf-b8d0-e47db6ee3797'), datetime.datetime(2024, 9, 20, 3, 9, 35, 773086, tzinfo=datetime.timezone.utc), datetime.datetime(2024, 9, 20, 3, 9, 35, 773086, tzinfo=datetime.timezone.utc), 1146, '', ...)
    ignored_wrapper_args = (False, {'connection': <DatabaseWrapper vendor='postgresql' alias='default'>, 'cursor': <django.db.backends.utils.CursorWrapper object at 0x7f5b56498e00>})

    def _execute(self, sql, params, *ignored_wrapper_args):
    self.db.validate_no_broken_transaction()
    with self.db.wrap_database_errors:
    if params is None:
    # params default might be backend specific.
    return self.cursor.execute(sql)
    else:
    > return self.cursor.execute(sql, params)
    E django.db.utils.IntegrityError: duplicate key value violates unique constraint "reports_test_pkey"
    E DETAIL: Key (id)=(0) already exists.

    .../local/lib/python3.12.../db/backends/utils.py:89: IntegrityError
  • Class name: tasks.tests.unit.test_process_flakes
    Test name: test_it_creates_flakes_from_orig_branch

    self = <django.db.backends.utils.CursorWrapper object at 0x7f5b564af290>
    sql = 'INSERT INTO "reports_test" ("id", "external_id", "created_at", "updated_at", "repoid", "name", "testsuite", "flags_hash", "failure_rate", "commits_where_fail") VALUES (%s, %s, %s, %s, %s, %s, %s, %s, %s, %s::text[])'
    params = ('0', UUID('e4f23470-c5a7-4d54-84e9-bbed8f842019'), datetime.datetime(2024, 9, 20, 3, 9, 35, 772969, tzinfo=datetime.timezone.utc), datetime.datetime(2024, 9, 20, 3, 9, 35, 772969, tzinfo=datetime.timezone.utc), 1144, '', ...)
    ignored_wrapper_args = (False, {'connection': <DatabaseWrapper vendor='postgresql' alias='default'>, 'cursor': <django.db.backends.utils.CursorWrapper object at 0x7f5b564af290>})

    def _execute(self, sql, params, *ignored_wrapper_args):
    self.db.validate_no_broken_transaction()
    with self.db.wrap_database_errors:
    if params is None:
    # params default might be backend specific.
    return self.cursor.execute(sql)
    else:
    > return self.cursor.execute(sql, params)
    E psycopg2.errors.UniqueViolation: duplicate key value violates unique constraint "reports_test_pkey"
    E DETAIL: Key (id)=(0) already exists.

    .../local/lib/python3.12.../db/backends/utils.py:89: UniqueViolation

    The above exception was the direct cause of the following exception:

    transactional_db = None

    @time_machine.travel(dt.datetime.now(tz=dt.UTC), tick=False)
    def test_it_creates_flakes_from_orig_branch(transactional_db):
        rs = RepoSimulator()
        c1 = rs.create_commit()
        orig_branch = c1.branch
    >   rs.add_test_instance(c1)

    .../tests/unit/test_process_flakes.py:219:
    _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
    .../tests/unit/test_process_flakes.py:52: in add_test_instance
    test=self.test_map[self.test_count],
    .../tests/unit/test_process_flakes.py:29: in <lambda>
    self.test_map = defaultdict(lambda: TestFactory(id=self.test_count))
    .../local/lib/python3.12............/site-packages/factory/base.py:40: in __call__
    return cls.create(**kwargs)
    .../local/lib/python3.12............/site-packages/factory/base.py:528: in create
    return cls._generate(enums.CREATE_STRATEGY, kwargs)
    .../local/lib/python3.12....../site-packages/factory/django.py:117: in _generate
    return super()._generate(strategy, params)
    .../local/lib/python3.12............/site-packages/factory/base.py:465: in _generate
    return step.build()
    .../local/lib/python3.12.../site-packages/factory/builder.py:262: in build
    instance = self.factory_meta.instantiate(
    .../local/lib/python3.12............/site-packages/factory/base.py:317: in instantiate
    return self.factory._create(model, *args, **kwargs)
    .../local/lib/python3.12....../site-packages/factory/django.py:166: in _create
    return manager.create(*args, **kwargs)
    .../local/lib/python3.12.../db/models/manager.py:87: in manager_method
    return getattr(self.get_queryset(), name)(*args, **kwargs)
    .../local/lib/python3.12.../db/models/query.py:658: in create
    obj.save(force_insert=True, using=self.db)
    .../local/lib/python3.12.../db/models/base.py:814: in save
    self.save_base(
    .../local/lib/python3.12.../db/models/base.py:877: in save_base
    updated = self._save_table(
    .../local/lib/python3.12.../db/models/base.py:1020: in _save_table
    results = self._do_insert(
    .../local/lib/python3.12.../db/models/base.py:1061: in _do_insert
    return manager._insert(
    .../local/lib/python3.12.../db/models/manager.py:87: in manager_method
    return getattr(self.get_queryset(), name)(*args, **kwargs)
    .../local/lib/python3.12.../db/models/query.py:1805: in _insert
    return query.get_compiler(using=using).execute_sql(returning_fields)
    .../local/lib/python3.12.../models/sql/compiler.py:1822: in execute_sql
    cursor.execute(sql, params)
    .../local/lib/python3.12.../db/backends/utils.py:67: in execute
    return self._execute_with_wrappers(
    .../local/lib/python3.12.../db/backends/utils.py:80: in _execute_with_wrappers
    return executor(sql, params, many, context)
    .../local/lib/python3.12.../db/backends/utils.py:84: in _execute
    with self.db.wrap_database_errors:
    .../local/lib/python3.12.../django/db/utils.py:91: in __exit__
    raise dj_exc_value.with_traceback(traceback) from exc_value
    _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

    self = <django.db.backends.utils.CursorWrapper object at 0x7f5b564af290>
    sql = 'INSERT INTO "reports_test" ("id", "external_id", "created_at", "updated_at", "repoid", "name", "testsuite", "flags_hash", "failure_rate", "commits_where_fail") VALUES (%s, %s, %s, %s, %s, %s, %s, %s, %s, %s::text[])'
    params = ('0', UUID('e4f23470-c5a7-4d54-84e9-bbed8f842019'), datetime.datetime(2024, 9, 20, 3, 9, 35, 772969, tzinfo=datetime.timezone.utc), datetime.datetime(2024, 9, 20, 3, 9, 35, 772969, tzinfo=datetime.timezone.utc), 1144, '', ...)
    ignored_wrapper_args = (False, {'connection': <DatabaseWrapper vendor='postgresql' alias='default'>, 'cursor': <django.db.backends.utils.CursorWrapper object at 0x7f5b564af290>})

    def _execute(self, sql, params, *ignored_wrapper_args):
        self.db.validate_no_broken_transaction()
        with self.db.wrap_database_errors:
            if params is None:
                # params default might be backend specific.
                return self.cursor.execute(sql)
            else:
>               return self.cursor.execute(sql, params)
    E django.db.utils.IntegrityError: duplicate key value violates unique constraint "reports_test_pkey"
    E DETAIL: Key (id)=(0) already exists.

    .../local/lib/python3.12.../db/backends/utils.py:89: IntegrityError
  • Class name: tasks.tests.unit.test_process_flakes
    Test name: test_it_does_not_detect_unmerged_tests

    transactional_db = None

    def test_it_does_not_detect_unmerged_tests(transactional_db):
        rs = RepoSimulator()
        c1 = rs.create_commit()

        rs.add_test_instance(c1)

        rs.add_test_instance(c1)

        ProcessFlakesTask().run_impl(
            None, repo_id=rs.repo.repoid, commit_id_list=[c1.commitid], branch=c1.branch
        )

>       assert len(Flake.objects.all()) == 0
    E assert 3 == 0
    E + where 3 = len(<QuerySet [<Flake: Flake object (1)>, <Flake: Flake object (2)>, <Flake: Flake object (3)>]>)
    E + where <QuerySet [<Flake: Flake object (1)>, <Flake: Flake object (2)>, <Flake: Flake object (3)>]> = <bound method BaseManager.all of <django.db.models.manager.Manager object at 0x7f5b564264b0>>()
    E + where <bound method BaseManager.all of <django.db.models.manager.Manager object at 0x7f5b564264b0>> = <django.db.models.manager.Manager object at 0x7f5b564264b0>.all
    E + where <django.db.models.manager.Manager object at 0x7f5b564264b0> = Flake.objects

    .../tests/unit/test_process_flakes.py:194: AssertionError
  • Class name: tasks.tests.unit.test_process_flakes
    Test name: test_it_handles_only_passes

    self = <django.db.backends.utils.CursorWrapper object at 0x7f5b56516c90>
    sql = 'INSERT INTO "reports_test" ("id", "external_id", "created_at", "updated_at", "repoid", "name", "testsuite", "flags_hash", "failure_rate", "commits_where_fail") VALUES (%s, %s, %s, %s, %s, %s, %s, %s, %s, %s::text[])'
    params = ('0', UUID('2e573064-06f4-4ab9-9928-e7c2555edfac'), datetime.datetime(2024, 9, 20, 3, 11, 35, 656635, tzinfo=datetime.timezone.utc), datetime.datetime(2024, 9, 20, 3, 11, 35, 656642, tzinfo=datetime.timezone.utc), 1142, '', ...)
    ignored_wrapper_args = (False, {'connection': <DatabaseWrapper vendor='postgresql' alias='default'>, 'cursor': <django.db.backends.utils.CursorWrapper object at 0x7f5b56516c90>})

    def _execute(self, sql, params, *ignored_wrapper_args):
        self.db.validate_no_broken_transaction()
        with self.db.wrap_database_errors:
            if params is None:
                # params default might be backend specific.
                return self.cursor.execute(sql)
            else:
>               return self.cursor.execute(sql, params)
    E psycopg2.errors.UniqueViolation: duplicate key value violates unique constraint "reports_test_pkey"
    E DETAIL: Key (id)=(0) already exists.

    .../local/lib/python3.12.../db/backends/utils.py:89: UniqueViolation

    The above exception was the direct cause of the following exception:

    transactional_db = None

    def test_it_handles_only_passes(transactional_db):
        rs = RepoSimulator()
        c1 = rs.create_commit()

>       rs.add_test_instance(c1)

    .../tests/unit/test_process_flakes.py:201:
    _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
    .../tests/unit/test_process_flakes.py:52: in add_test_instance
    test=self.test_map[self.test_count],
    .../tests/unit/test_process_flakes.py:29: in <lambda>
    self.test_map = defaultdict(lambda: TestFactory(id=self.test_count))
    .../local/lib/python3.12............/site-packages/factory/base.py:40: in __call__
    return cls.create(**kwargs)
    .../local/lib/python3.12............/site-packages/factory/base.py:528: in create
    return cls._generate(enums.CREATE_STRATEGY, kwargs)
    .../local/lib/python3.12....../site-packages/factory/django.py:117: in _generate
    return super()._generate(strategy, params)
    .../local/lib/python3.12............/site-packages/factory/base.py:465: in _generate
    return step.build()
    .../local/lib/python3.12.../site-packages/factory/builder.py:262: in build
    instance = self.factory_meta.instantiate(
    .../local/lib/python3.12............/site-packages/factory/base.py:317: in instantiate
    return self.factory._create(model, *args, **kwargs)
    .../local/lib/python3.12....../site-packages/factory/django.py:166: in _create
    return manager.create(*args, **kwargs)
    .../local/lib/python3.12.../db/models/manager.py:87: in manager_method
    return getattr(self.get_queryset(), name)(*args, **kwargs)
    .../local/lib/python3.12.../db/models/query.py:658: in create
    obj.save(force_insert=True, using=self.db)
    .../local/lib/python3.12.../db/models/base.py:814: in save
    self.save_base(
    .../local/lib/python3.12.../db/models/base.py:877: in save_base
    updated = self._save_table(
    .../local/lib/python3.12.../db/models/base.py:1020: in _save_table
    results = self._do_insert(
    .../local/lib/python3.12.../db/models/base.py:1061: in _do_insert
    return manager._insert(
    .../local/lib/python3.12.../db/models/manager.py:87: in manager_method
    return getattr(self.get_queryset(), name)(*args, **kwargs)
    .../local/lib/python3.12.../db/models/query.py:1805: in _insert
    return query.get_compiler(using=using).execute_sql(returning_fields)
    .../local/lib/python3.12.../models/sql/compiler.py:1822: in execute_sql
    cursor.execute(sql, params)
    .../local/lib/python3.12.../db/backends/utils.py:67: in execute
    return self._execute_with_wrappers(
    .../local/lib/python3.12.../db/backends/utils.py:80: in _execute_with_wrappers
    return executor(sql, params, many, context)
    .../local/lib/python3.12.../db/backends/utils.py:84: in _execute
    with self.db.wrap_database_errors:
    .../local/lib/python3.12.../django/db/utils.py:91: in __exit__
    raise dj_exc_value.with_traceback(traceback) from exc_value
    _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

    self = <django.db.backends.utils.CursorWrapper object at 0x7f5b56516c90>
    sql = 'INSERT INTO "reports_test" ("id", "external_id", "created_at", "updated_at", "repoid", "name", "testsuite", "flags_hash", "failure_rate", "commits_where_fail") VALUES (%s, %s, %s, %s, %s, %s, %s, %s, %s, %s::text[])'
    params = ('0', UUID('2e573064-06f4-4ab9-9928-e7c2555edfac'), datetime.datetime(2024, 9, 20, 3, 11, 35, 656635, tzinfo=datetime.timezone.utc), datetime.datetime(2024, 9, 20, 3, 11, 35, 656642, tzinfo=datetime.timezone.utc), 1142, '', ...)
    ignored_wrapper_args = (False, {'connection': <DatabaseWrapper vendor='postgresql' alias='default'>, 'cursor': <django.db.backends.utils.CursorWrapper object at 0x7f5b56516c90>})

    def _execute(self, sql, params, *ignored_wrapper_args):
        self.db.validate_no_broken_transaction()
        with self.db.wrap_database_errors:
            if params is None:
                # params default might be backend specific.
                return self.cursor.execute(sql)
            else:
>               return self.cursor.execute(sql, params)
    E django.db.utils.IntegrityError: duplicate key value violates unique constraint "reports_test_pkey"
    E DETAIL: Key (id)=(0) already exists.

    .../local/lib/python3.12.../db/backends/utils.py:89: IntegrityError
  • Class name: tasks.tests.unit.test_process_flakes
    Test name: test_it_processes_two_commits_separately

    self = <django.db.backends.utils.CursorWrapper object at 0x7f5b56424dd0>
    sql = 'INSERT INTO "reports_test" ("id", "external_id", "created_at", "updated_at", "repoid", "name", "testsuite", "flags_hash", "failure_rate", "commits_where_fail") VALUES (%s, %s, %s, %s, %s, %s, %s, %s, %s, %s::text[])'
    params = ('0', UUID('de9f7ba9-5c3d-452e-a1cf-1c4bd748e04d'), datetime.datetime(2024, 9, 20, 3, 9, 35, 773136, tzinfo=datetime.timezone.utc), datetime.datetime(2024, 9, 20, 3, 9, 35, 773136, tzinfo=datetime.timezone.utc), 1152, '', ...)
    ignored_wrapper_args = (False, {'connection': <DatabaseWrapper vendor='postgresql' alias='default'>, 'cursor': <django.db.backends.utils.CursorWrapper object at 0x7f5b56424dd0>})

    def _execute(self, sql, params, *ignored_wrapper_args):
        self.db.validate_no_broken_transaction()
        with self.db.wrap_database_errors:
            if params is None:
                # params default might be backend specific.
                return self.cursor.execute(sql)
            else:
>               return self.cursor.execute(sql, params)
    E psycopg2.errors.UniqueViolation: duplicate key value violates unique constraint "reports_test_pkey"
    E DETAIL: Key (id)=(0) already exists.

    .../local/lib/python3.12.../db/backends/utils.py:89: UniqueViolation

    The above exception was the direct cause of the following exception:

    transactional_db = None

    @time_machine.travel(dt.datetime.now(tz=dt.UTC), tick=False)
    def test_it_processes_two_commits_separately(transactional_db):
        rs = RepoSimulator()
        c1 = rs.create_commit()
        rs.merge(c1)
>       rs.add_test_instance(c1, outcome=TestInstance.Outcome.FAILURE.value)

    .../tests/unit/test_process_flakes.py:318:
    _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
    .../tests/unit/test_process_flakes.py:52: in add_test_instance
    test=self.test_map[self.test_count],
    .../tests/unit/test_process_flakes.py:29: in <lambda>
    self.test_map = defaultdict(lambda: TestFactory(id=self.test_count))
    .../local/lib/python3.12............/site-packages/factory/base.py:40: in __call__
    return cls.create(**kwargs)
    .../local/lib/python3.12............/site-packages/factory/base.py:528: in create
    return cls._generate(enums.CREATE_STRATEGY, kwargs)
    .../local/lib/python3.12....../site-packages/factory/django.py:117: in _generate
    return super()._generate(strategy, params)
    .../local/lib/python3.12............/site-packages/factory/base.py:465: in _generate
    return step.build()
    .../local/lib/python3.12.../site-packages/factory/builder.py:262: in build
    instance = self.factory_meta.instantiate(
    .../local/lib/python3.12............/site-packages/factory/base.py:317: in instantiate
    return self.factory._create(model, *args, **kwargs)
    .../local/lib/python3.12....../site-packages/factory/django.py:166: in _create
    return manager.create(*args, **kwargs)
    .../local/lib/python3.12.../db/models/manager.py:87: in manager_method
    return getattr(self.get_queryset(), name)(*args, **kwargs)
    .../local/lib/python3.12.../db/models/query.py:658: in create
    obj.save(force_insert=True, using=self.db)
    .../local/lib/python3.12.../db/models/base.py:814: in save
    self.save_base(
    .../local/lib/python3.12.../db/models/base.py:877: in save_base
    updated = self._save_table(
    .../local/lib/python3.12.../db/models/base.py:1020: in _save_table
    results = self._do_insert(
    .../local/lib/python3.12.../db/models/base.py:1061: in _do_insert
    return manager._insert(
    .../local/lib/python3.12.../db/models/manager.py:87: in manager_method
    return getattr(self.get_queryset(), name)(*args, **kwargs)
    .../local/lib/python3.12.../db/models/query.py:1805: in _insert
    return query.get_compiler(using=using).execute_sql(returning_fields)
    .../local/lib/python3.12.../models/sql/compiler.py:1822: in execute_sql
    cursor.execute(sql, params)
    .../local/lib/python3.12.../db/backends/utils.py:67: in execute
    return self._execute_with_wrappers(
    .../local/lib/python3.12.../db/backends/utils.py:80: in _execute_with_wrappers
    return executor(sql, params, many, context)
    .../local/lib/python3.12.../db/backends/utils.py:84: in _execute
    with self.db.wrap_database_errors:
    .../local/lib/python3.12.../django/db/utils.py:91: in __exit__
    raise dj_exc_value.with_traceback(traceback) from exc_value
    _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

    self = <django.db.backends.utils.CursorWrapper object at 0x7f5b56424dd0>
    sql = 'INSERT INTO "reports_test" ("id", "external_id", "created_at", "updated_at", "repoid", "name", "testsuite", "flags_hash", "failure_rate", "commits_where_fail") VALUES (%s, %s, %s, %s, %s, %s, %s, %s, %s, %s::text[])'
    params = ('0', UUID('de9f7ba9-5c3d-452e-a1cf-1c4bd748e04d'), datetime.datetime(2024, 9, 20, 3, 9, 35, 773136, tzinfo=datetime.timezone.utc), datetime.datetime(2024, 9, 20, 3, 9, 35, 773136, tzinfo=datetime.timezone.utc), 1152, '', ...)
    ignored_wrapper_args = (False, {'connection': <DatabaseWrapper vendor='postgresql' alias='default'>, 'cursor': <django.db.backends.utils.CursorWrapper object at 0x7f5b56424dd0>})

    def _execute(self, sql, params, *ignored_wrapper_args):
        self.db.validate_no_broken_transaction()
        with self.db.wrap_database_errors:
            if params is None:
                # params default might be backend specific.
                return self.cursor.execute(sql)
            else:
>               return self.cursor.execute(sql, params)
    E django.db.utils.IntegrityError: duplicate key value violates unique constraint "reports_test_pkey"
    E DETAIL: Key (id)=(0) already exists.

    .../local/lib/python3.12.../db/backends/utils.py:89: IntegrityError
  • Class name: tasks.tests.unit.test_process_flakes
    Test name: test_it_processes_two_commits_together

    self = <django.db.backends.utils.CursorWrapper object at 0x7f5b565c0860>
    sql = 'INSERT INTO "reports_test" ("id", "external_id", "created_at", "updated_at", "repoid", "name", "testsuite", "flags_hash", "failure_rate", "commits_where_fail") VALUES (%s, %s, %s, %s, %s, %s, %s, %s, %s, %s::text[])'
    params = ('0', UUID('65b77350-365f-4e7a-8b7b-e602b5583873'), datetime.datetime(2024, 9, 20, 3, 9, 35, 773121, tzinfo=datetime.timezone.utc), datetime.datetime(2024, 9, 20, 3, 9, 35, 773121, tzinfo=datetime.timezone.utc), 1150, '', ...)
    ignored_wrapper_args = (False, {'connection': <DatabaseWrapper vendor='postgresql' alias='default'>, 'cursor': <django.db.backends.utils.CursorWrapper object at 0x7f5b565c0860>})

    def _execute(self, sql, params, *ignored_wrapper_args):
        self.db.validate_no_broken_transaction()
        with self.db.wrap_database_errors:
            if params is None:
                # params default might be backend specific.
                return self.cursor.execute(sql)
            else:
>               return self.cursor.execute(sql, params)
    E psycopg2.errors.UniqueViolation: duplicate key value violates unique constraint "reports_test_pkey"
    E DETAIL: Key (id)=(0) already exists.

    .../local/lib/python3.12.../db/backends/utils.py:89: UniqueViolation

    The above exception was the direct cause of the following exception:

    transactional_db = None

    @time_machine.travel(dt.datetime.now(tz=dt.UTC), tick=False)
    def test_it_processes_two_commits_together(transactional_db):
        rs = RepoSimulator()
        c1 = rs.create_commit()
        rs.merge(c1)
>       rs.add_test_instance(c1, outcome=TestInstance.Outcome.FAILURE.value)

    .../tests/unit/test_process_flakes.py:292:
    _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
    .../tests/unit/test_process_flakes.py:52: in add_test_instance
    test=self.test_map[self.test_count],
    .../tests/unit/test_process_flakes.py:29: in <lambda>
    self.test_map = defaultdict(lambda: TestFactory(id=self.test_count))
    .../local/lib/python3.12............/site-packages/factory/base.py:40: in __call__
    return cls.create(**kwargs)
    .../local/lib/python3.12............/site-packages/factory/base.py:528: in create
    return cls._generate(enums.CREATE_STRATEGY, kwargs)
    .../local/lib/python3.12....../site-packages/factory/django.py:117: in _generate
    return super()._generate(strategy, params)
    .../local/lib/python3.12............/site-packages/factory/base.py:465: in _generate
    return step.build()
    .../local/lib/python3.12.../site-packages/factory/builder.py:262: in build
    instance = self.factory_meta.instantiate(
    .../local/lib/python3.12............/site-packages/factory/base.py:317: in instantiate
    return self.factory._create(model, *args, **kwargs)
    .../local/lib/python3.12....../site-packages/factory/django.py:166: in _create
    return manager.create(*args, **kwargs)
    .../local/lib/python3.12.../db/models/manager.py:87: in manager_method
    return getattr(self.get_queryset(), name)(*args, **kwargs)
    .../local/lib/python3.12.../db/models/query.py:658: in create
    obj.save(force_insert=True, using=self.db)
    .../local/lib/python3.12.../db/models/base.py:814: in save
    self.save_base(
    .../local/lib/python3.12.../db/models/base.py:877: in save_base
    updated = self._save_table(
    .../local/lib/python3.12.../db/models/base.py:1020: in _save_table
    results = self._do_insert(
    .../local/lib/python3.12.../db/models/base.py:1061: in _do_insert
    return manager._insert(
    .../local/lib/python3.12.../db/models/manager.py:87: in manager_method
    return getattr(self.get_queryset(), name)(*args, **kwargs)
    .../local/lib/python3.12.../db/models/query.py:1805: in _insert
    return query.get_compiler(using=using).execute_sql(returning_fields)
    .../local/lib/python3.12.../models/sql/compiler.py:1822: in execute_sql
    cursor.execute(sql, params)
    .../local/lib/python3.12.../db/backends/utils.py:67: in execute
    return self._execute_with_wrappers(
    .../local/lib/python3.12.../db/backends/utils.py:80: in _execute_with_wrappers
    return executor(sql, params, many, context)
    .../local/lib/python3.12.../db/backends/utils.py:84: in _execute
    with self.db.wrap_database_errors:
    .../local/lib/python3.12.../django/db/utils.py:91: in __exit__
    raise dj_exc_value.with_traceback(traceback) from exc_value
    _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

    self = <django.db.backends.utils.CursorWrapper object at 0x7f5b565c0860>
    sql = 'INSERT INTO "reports_test" ("id", "external_id", "created_at", "updated_at", "repoid", "name", "testsuite", "flags_hash", "failure_rate", "commits_where_fail") VALUES (%s, %s, %s, %s, %s, %s, %s, %s, %s, %s::text[])'
    params = ('0', UUID('65b77350-365f-4e7a-8b7b-e602b5583873'), datetime.datetime(2024, 9, 20, 3, 9, 35, 773121, tzinfo=datetime.timezone.utc), datetime.datetime(2024, 9, 20, 3, 9, 35, 773121, tzinfo=datetime.timezone.utc), 1150, '', ...)
    ignored_wrapper_args = (False, {'connection': <DatabaseWrapper vendor='postgresql' alias='default'>, 'cursor': <django.db.backends.utils.CursorWrapper object at 0x7f5b565c0860>})

    def _execute(self, sql, params, *ignored_wrapper_args):
        self.db.validate_no_broken_transaction()
        with self.db.wrap_database_errors:
            if params is None:
                # params default might be backend specific.
                return self.cursor.execute(sql)
            else:
>               return self.cursor.execute(sql, params)
    E django.db.utils.IntegrityError: duplicate key value violates unique constraint "reports_test_pkey"
    E DETAIL: Key (id)=(0) already exists.

    .../local/lib/python3.12.../db/backends/utils.py:89: IntegrityError
  • Class name: tasks.tests.unit.test_process_flakes
    Test name: test_update_passed_flakes

    No failure message available
  • Class name: tasks.tests.unit.test_process_flakes
    Test name: test_upsert_failed_flakes

    No failure message available
  • Class name: tasks.tests.unit.test_upload_finisher_task.TestUploadFinisherTask
    Test name: test_finish_reports_processing_with_pull

    self = <sqlalchemy.engine.base.Connection object at 0x7f5b60219580>
    dialect = <sqlalchemy.dialects.postgresql.psycopg2.PGDialect_psycopg2 object at 0x7f5b8a64deb0>
    constructor = <bound method DefaultExecutionContext._init_compiled of <class 'sqlalchemy.dialects.postgresql.psycopg2.PGExecutionContext_psycopg2'>>
    statement = 'INSERT INTO pulls (repoid, pullid, issueid, updatestamp, state, title, base, user_provided_base_sha, compared_to, hea...ndle_analysis_commentid)s, %(author)s, %(behind_by)s, %(behind_by_commit)s, %(flare_storage_path)s) RETURNING pulls.id'
    parameters = {'author': 2798, 'base': None, 'behind_by': None, 'behind_by_commit': None, ...}
    args = (<sqlalchemy.dialects.postgresql.psycopg2.PGCompiler_psycopg2 object at 0x7f5b880e34a0>, [{'author': 2798, 'base': None, 'behind_by': None, 'behind_by_commit': None, ...}])
    conn = <sqlalchemy.pool.base._ConnectionFairy object at 0x7f5b602ed460>
    context = <sqlalchemy.dialects.postgresql.psycopg2.PGExecutionContext_psycopg2 object at 0x7f5b60824440>

    def _execute_context(
        self, dialect, constructor, statement, parameters, *args
    ):
        """Create an :class:`.ExecutionContext` and execute, returning
        a :class:`_engine.ResultProxy`.

        """

        try:
            try:
                conn = self.__connection
            except AttributeError:
                # escape "except AttributeError" before revalidating
                # to prevent misleading stacktraces in Py3K
                conn = None
            if conn is None:
                conn = self._revalidate_connection()

            context = constructor(dialect, self, conn, *args)
        except BaseException as e:
            self._handle_dbapi_exception(
                e, util.text_type(statement), parameters, None, None
            )

        if context.compiled:
            context.pre_exec()

        cursor, statement, parameters = (
            context.cursor,
            context.statement,
            context.parameters,
        )

        if not context.executemany:
            parameters = parameters[0]

        if self._has_events or self.engine._has_events:
            for fn in self.dispatch.before_cursor_execute:
                statement, parameters = fn(
                    self,
                    cursor,
                    statement,
                    parameters,
                    context,
                    context.executemany,
                )

        if self._echo:
            self.engine.logger.info(statement)
            if not self.engine.hide_parameters:
                self.engine.logger.info(
                    "%r",
                    sql_util._repr_params(
                        parameters, batches=10, ismulti=context.executemany
                    ),
                )
            else:
                self.engine.logger.info(
                    "[SQL parameters hidden due to hide_parameters=True]"
                )

        evt_handled = False
        try:
            if context.executemany:
                if self.dialect._has_events:
                    for fn in self.dialect.dispatch.do_executemany:
                        if fn(cursor, statement, parameters, context):
                            evt_handled = True
                            break
                if not evt_handled:
                    self.dialect.do_executemany(
                        cursor, statement, parameters, context
                    )
            elif not parameters and context.no_parameters:
                if self.dialect._has_events:
                    for fn in self.dialect.dispatch.do_execute_no_params:
                        if fn(cursor, statement, context):
                            evt_handled = True
                            break
                if not evt_handled:
                    self.dialect.do_execute_no_params(
                        cursor, statement, context
                    )
            else:
                if self.dialect._has_events:
                    for fn in self.dialect.dispatch.do_execute:
                        if fn(cursor, statement, parameters, context):
                            evt_handled = True
                            break
                if not evt_handled:
>                   self.dialect.do_execute(
                        cursor, statement, parameters, context
                    )

    .../local/lib/python3.12.../sqlalchemy/engine/base.py:1276:
    _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

    self = <sqlalchemy.dialects.postgresql.psycopg2.PGDialect_psycopg2 object at 0x7f5b8a64deb0>
    cursor = <cursor object at 0x7f5b887eff10; closed: -1>
    statement = 'INSERT INTO pulls (repoid, pullid, issueid, updatestamp, state, title, base, user_provided_base_sha, compared_to, hea...ndle_analysis_commentid)s, %(author)s, %(behind_by)s, %(behind_by_commit)s, %(flare_storage_path)s) RETURNING pulls.id'
    parameters = {'author': 2798, 'base': None, 'behind_by': None, 'behind_by_commit': None, ...}
    context = <sqlalchemy.dialects.postgresql.psycopg2.PGExecutionContext_psycopg2 object at 0x7f5b60824440>

    def do_execute(self, cursor, statement, parameters, context=None):
>       cursor.execute(statement, parameters)
    E psycopg2.errors.UniqueViolation: duplicate key value violates unique constraint "pulls_repoid_pullid"
    E DETAIL: Key (repoid, pullid)=(1055, 77) already exists.

    .../local/lib/python3.12.../sqlalchemy/engine/default.py:608: UniqueViolation

    The above exception was the direct cause of the following exception:

    self = <worker.tasks.tests.unit.test_upload_finisher_task.TestUploadFinisherTask object at 0x7f5b89e8c920>
    dbsession = <sqlalchemy.orm.session.Session object at 0x7f5b602ef260>
    mocker = <pytest_mock.plugin.MockFixture object at 0x7f5b602eccb0>

    def test_finish_reports_processing_with_pull(self, dbsession, mocker):
        commit_yaml = {}
        mocked_app = mocker.patch.object(
            UploadFinisherTask,
            "app",
            tasks={
                "app.tasks.notify.Notify": mocker.MagicMock(),
                "app.tasks.pulls.Sync": mocker.MagicMock(),
                "app.tasks.compute_comparison.ComputeComparison": mocker.MagicMock(),
                "app.tasks.upload.UploadCleanLabelsIndex": mocker.MagicMock(),
            },
        )
        repository = RepositoryFactory.create(
            owner__unencrypted_oauth_token="testulk3d54rlhxkjyzomq2wh8b7np47xabcrkx8",
            owner__username="ThiagoCodecov",
            yaml=commit_yaml,
        )
        dbsession.add(repository)
        dbsession.flush()
        pull = PullFactory.create(repository=repository)
        compared_to = CommitFactory.create(repository=repository)
        pull.compared_to = compared_to.commitid
        commit = CommitFactory.create(
            message="dsidsahdsahdsa",
            commitid="abf6d4df662c47e32460020ab14abf9303581429",
            repository=repository,
            pullid=pull.pullid,
        )
        processing_results = {"processings_so_far": [{"successful": True}]}
        dbsession.add(commit)
        dbsession.add(compared_to)
        dbsession.add(pull)
>       dbsession.flush()

    .../tests/unit/test_upload_finisher_task.py:452:
    _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
    .../local/lib/python3.12.../sqlalchemy/orm/session.py:2540: in flush
    self._flush(objects)
    .../local/lib/python3.12.../sqlalchemy/orm/session.py:2681: in _flush
    with util.safe_reraise():
    .../local/lib/python3.12.../sqlalchemy/util/langhelpers.py:68: in __exit__
    compat.raise_(
    .../local/lib/python3.12.../sqlalchemy/util/compat.py:182: in raise_
    raise exception
    .../local/lib/python3.12.../sqlalchemy/orm/session.py:2642: in _flush
    flush_context.execute()
    .../local/lib/python3.12.../sqlalchemy/orm/unitofwork.py:422: in execute
    rec.execute(self)
    .../local/lib/python3.12.../sqlalchemy/orm/unitofwork.py:586: in execute
    persistence.save_obj(
    .../local/lib/python3.12.../sqlalchemy/orm/persistence.py:239: in save_obj
    _emit_insert_statements(
    .../local/lib/python3.12.../sqlalchemy/orm/persistence.py:1135: in _emit_insert_statements
    result = cached_connections[connection].execute(
    .../local/lib/python3.12.../sqlalchemy/engine/base.py:1011: in execute
    return meth(self, multiparams, params)
    .../local/lib/python3.12.../sqlalchemy/sql/elements.py:298: in _execute_on_connection
    return connection._execute_clauseelement(self, multiparams, params)
    .../local/lib/python3.12.../sqlalchemy/engine/base.py:1124: in _execute_clauseelement
    ret = self._execute_context(
    .../local/lib/python3.12.../sqlalchemy/engine/base.py:1316: in _execute_context
    self._handle_dbapi_exception(
    .../local/lib/python3.12.../sqlalchemy/engine/base.py:1510: in _handle_dbapi_exception
    util.raise_(
    .../local/lib/python3.12.../sqlalchemy/util/compat.py:182: in raise_
    raise exception
    .../local/lib/python3.12.../sqlalchemy/engine/base.py:1276: in _execute_context
    self.dialect.do_execute(
    _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

    self = <sqlalchemy.dialects.postgresql.psycopg2.PGDialect_psycopg2 object at 0x7f5b8a64deb0>
    cursor = <cursor object at 0x7f5b887eff10; closed: -1>
    statement = 'INSERT INTO pulls (repoid, pullid, issueid, updatestamp, state, title, base, user_provided_base_sha, compared_to, hea...ndle_analysis_commentid)s, %(author)s, %(behind_by)s, %(behind_by_commit)s, %(flare_storage_path)s) RETURNING pulls.id'
    parameters = {'author': 2798, 'base': None, 'behind_by': None, 'behind_by_commit': None, ...}
    context = <sqlalchemy.dialects.postgresql.psycopg2.PGExecutionContext_psycopg2 object at 0x7f5b60824440>

    def do_execute(self, cursor, statement, parameters, context=None):
>       cursor.execute(statement, parameters)
    E sqlalchemy.exc.IntegrityError: (psycopg2.errors.UniqueViolation) duplicate key value violates unique constraint "pulls_repoid_pullid"
    E DETAIL: Key (repoid, pullid)=(1055, 77) already exists.
    E
    E [SQL: INSERT INTO pulls (repoid, pullid, issueid, updatestamp, state, title, base, user_provided_base_sha, compared_to, head, commentid, bundle_analysis_commentid, author, behind_by, behind_by_commit, flare_storage_path) VALUES (%(repoid)s, %(pullid)s, %(issueid)s, %(updatestamp)s, %(state)s, %(title)s, %(base)s, %(user_provided_base_sha)s, %(compared_to)s, %(head)s, %(commentid)s, %(bundle_analysis_commentid)s, %(author)s, %(behind_by)s, %(behind_by_commit)s, %(flare_storage_path)s) RETURNING pulls.id]
    E [parameters: {'repoid': 1055, 'pullid': 77, 'issueid': None, 'updatestamp': datetime.datetime(2024, 9, 20, 3, 11, 6, 546025), 'state': 'open', 'title': None, 'base': None, 'user_provided_base_sha': None, 'compared_to': 'fe583a66f8d30ec7a4e24b9d5e9dc7513a95cbb0', 'head': None, 'commentid': None, 'bundle_analysis_commentid': None, 'author': 2798, 'behind_by': None, 'behind_by_commit': None, 'flare_storage_path': None}]
    E (Background on this error at: http://sqlalche..../e/13/gkpj)

    .../local/lib/python3.12.../sqlalchemy/engine/default.py:608: IntegrityError

* null constraints on installation_ids
* default yaml for owners created after a certain date
The owner.createstamp now defaults to now, which breaks all the tests that assume we are before the patch-centric YAML selection. We now force the createstamp of the owner in those tests.
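
As a rough sketch of that fix (the cutoff date, import path, and factory fields below are illustrative assumptions, not the exact values used in this PR), a test can pin the owner's createstamp to a date safely before the assumed patch-centric YAML cutoff instead of relying on the factory's implicit default of "now":

    import datetime as dt

    from database.tests.factories import OwnerFactory, RepositoryFactory  # assumed import path

    # Pin createstamp to a timezone-aware date assumed to be before the
    # patch-centric YAML cutoff, so default-YAML selection behaves the way
    # the existing tests expect.
    owner = OwnerFactory.create(
        createstamp=dt.datetime(2023, 1, 1, tzinfo=dt.timezone.utc)
    )
    repository = RepositoryFactory.create(owner=owner)

Any date on the expected side of the cutoff works; the point is that the tests no longer depend on whatever default the factory fills in.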