attestation-sdk

attestation-sdk demonstrates an AttestationFailureAnalysisService -- an extensible (dependency-injection focused) service tailored to provide machine-readable reports explaining why Remote Attestation failed.

The initial purpose of this service is to automatically handle the red block in this diagram: attestation_overall.png

This service is not a production-ready magic solution to all of your attestation problems, but rather a generic template reusable across different companies and attestation/provisioning flows. The specifics of your attestation/provisioning flow require your own implementation.

Use cases

Remediation

Let's assume the successful attestation flow looks something like this: attestation_success.png

In this case attestation failure would be something like this: attestation_failure_without_afas.png

So any failure, for any reason, requires manual intervention: to understand what exactly happened, decide whether it requires a security escalation (and escalate if so), figure out how to fix it, fix it, and restart the process.

But with AttestationFailureAnalysisService the flow becomes something like this: attestation_failure_with_afas.png

Thus it makes it possible to automate the handling of attestation failures.

How to use this project

AttestationFailureAnalysisService

To use this project you need to implement your own analyzers (see examples in pkg/analyzers), and reimplement pkg/server/controller and cmd/afasd. The implementations of controller and cmd/afasd in this repository are just a hello-world example.

As part of the pkg/server/controller re-implementation you might also want to re-implement these interfaces:

type DeviceGetter interface {
    GetDeviceByHostname(hostname string) (*device.Device, error)
    GetDeviceByAssetID(assetID int64) (*device.Device, error)
    // you may also re-define if/device.thrift
}

type originalFWImageRepository interface {
    DownloadByVersion(ctx context.Context, version string) ([]byte, string, error)
}
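
For illustration, a minimal sketch of a DeviceGetter backed by a company-internal inventory service could look like the following. The invclient package, its methods, the ToDevice conversion and the import paths are assumptions made up for this example, not part of this repository:

package controller

import (
    "fmt"

    "example.com/yourcompany/invclient" // hypothetical inventory/CMDB client

    // Adjust this import path to wherever the (thrift-generated) device
    // package lives in your tree; see if/device.thrift.
    "github.com/immune-gmbh/attestation-sdk/if/generated/device"
)

// inventoryDeviceGetter implements DeviceGetter on top of a hypothetical
// company-internal inventory service.
type inventoryDeviceGetter struct {
    inv *invclient.Client
}

func (g *inventoryDeviceGetter) GetDeviceByHostname(hostname string) (*device.Device, error) {
    rec, err := g.inv.LookupByHostname(hostname)
    if err != nil {
        return nil, fmt.Errorf("unable to find host %q in the inventory: %w", hostname, err)
    }
    // ToDevice converts the inventory record into the thrift-defined
    // device.Device structure (the conversion is company-specific).
    return rec.ToDevice(), nil
}

func (g *inventoryDeviceGetter) GetDeviceByAssetID(assetID int64) (*device.Device, error) {
    rec, err := g.inv.LookupByAssetID(assetID)
    if err != nil {
        return nil, fmt.Errorf("unable to find asset %d in the inventory: %w", assetID, err)
    }
    return rec.ToDevice(), nil
}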

And as part of the cmd/afasd re-implementation you might also want to re-implement this interface:

type BlobStorage interface {
    io.Closer
    Get(ctx context.Context, key []byte) ([]byte, error)
    Replace(ctx context.Context, key []byte, blob []byte) error
    Delete(ctx context.Context, key []byte) error
}
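
For instance, a minimal BlobStorage sketch backed by a local directory could look like this (purely illustrative; a real deployment would typically target an object store or a dedicated file storage service):

package main

import (
    "context"
    "encoding/hex"
    "os"
    "path/filepath"
)

// fsBlobStorage is a minimal sketch of BlobStorage backed by a local
// directory; keys are hex-encoded so that arbitrary byte keys map to
// valid file names.
type fsBlobStorage struct {
    dir string
}

func (s *fsBlobStorage) path(key []byte) string {
    return filepath.Join(s.dir, hex.EncodeToString(key))
}

func (s *fsBlobStorage) Get(ctx context.Context, key []byte) ([]byte, error) {
    return os.ReadFile(s.path(key))
}

func (s *fsBlobStorage) Replace(ctx context.Context, key []byte, blob []byte) error {
    return os.WriteFile(s.path(key), blob, 0o644)
}

func (s *fsBlobStorage) Delete(ctx context.Context, key []byte) error {
    return os.Remove(s.path(key))
}

// Close satisfies io.Closer; there is nothing to release for a
// directory-backed store.
func (s *fsBlobStorage) Close() error {
    return nil
}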

As a result you will have your own implementation of an attestation failure analysis service, tailored to your attestation/provisioning flows, while the generic logic (such as the generic API for analyzers) remains shared with other companies. You may also share specific analyzers with the public (similar to how some analyzers are published here).

So overall you just copy the hello-world implementations provided here and gradually start changing them in your repository, trying to reuse as much code as possible from this repository. Additional references:

  • Here you can find the dispatcher of analyzers (when you reimplement the controller, this is the place where you can add your own analyzers):
  • And here you can find an example of how to add your own DataCalculators.

The analysis in this service is supposed to depend heavily on the bootflow package of converged-security-suite. This package is designed to be dependency-injectable, so when implementing an Analyzer feel free to define your own Flow. <...the documentation how to do that is to be developed...>

Design

Network components

AttestationFailureAnalysisService implies dependency injection of its major components, so the resulting scheme may differ between deployments. But on average it could look something like this: afas_components.png

For a dummy demonstration there is a docker-compose.yml file, which brings up a scheme similar to the one shown above, except that afasd accesses the firmware tables directly (they are stored in the same database "afasd") and nginx is used to access the FileStorage (for simplicity of the demonstration).

Analysis batching

To satisfy a reasonable SLA for a single analysis request (addressed to multiple Analyzers), we batch analyzer requests together.

analysis_batching.png

So in a nutshell:

  1. The client puts all artifacts into a list of artifacts. It then enumerates the list of Analyzer-s it wishes to run and defines artifact indexes as inputs to the Analyzer-s (so that analyzers reuse the same artifacts and there is no need to send the same 64MiB image to all analyzers separately; see the sketch after this list).
  2. The client sends the list of artifacts with the list of analyzer inputs to the server.
  3. The server finds an analyzer for each analyzer input and compiles an input for it.
  4. The server runs all analyzers concurrently.
  5. The server gathers reports from all analyzers and returns them to the client (and also stores them in the DB).
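
Schematically, a request could be pictured like this (the real request/response types are defined in the service's thrift API; all type, field and analyzer names below are illustrative assumptions, not the actual API):

// Illustrative sketch of the batching idea; the real types live in the
// service's thrift API and differ in naming and details.
type Artifact []byte // e.g. a raw firmware image or a firmware version string

type AnalyzerInput struct {
    AnalyzerID      string // which analyzer to run
    ArtifactIndexes []int  // indexes into the shared artifact list
}

type AnalyzeRequest struct {
    Artifacts []Artifact      // each artifact is transferred only once
    Analyzers []AnalyzerInput // many analyzers may reference the same indexes
}

func exampleRequest(actualImage []byte, origFirmwareVersion string) AnalyzeRequest {
    return AnalyzeRequest{
        Artifacts: []Artifact{actualImage, Artifact(origFirmwareVersion)},
        Analyzers: []AnalyzerInput{
            // Both analyzers reference artifact #0, so the 64MiB image is
            // sent to the server only once.
            {AnalyzerID: "AnalyzerA", ArtifactIndexes: []int{0, 1}},
            {AnalyzerID: "AnalyzerB", ArtifactIndexes: []int{0, 1}},
        },
    }
}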

The "compiles" from point #3 above involves tree-like value resolution. For example, a couple of analyzers wants to see parsed original firmware, aligned with the actual firmware. But the client provided a binary image of the actual firmware and only a firmware version to find the original firmware. And converting the raw inputs to the required inputs -- is an expensive computation. So it is performed only once (during a single request) and then is fed to multiple analyzers (who requested that):

inputs_flow.png
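
Conceptually, this boils down to per-request memoization of the expensive conversions, so that each derived value is computed once and shared by every analyzer that asked for it. A minimal sketch of that idea (not the actual DataCalculator machinery of this repository):

package main

import "sync"

// calcCache memoizes expensive input conversions within a single request;
// it only illustrates the "compute once, feed many analyzers" idea.
type calcCache struct {
    mu      sync.Mutex
    results map[string]func() (any, error)
}

// once wraps an expensive calculation so that it is executed at most once
// per request, no matter how many analyzers ask for the same derived input.
func (c *calcCache) once(key string, calc func() (any, error)) func() (any, error) {
    c.mu.Lock()
    defer c.mu.Unlock()
    if c.results == nil {
        c.results = map[string]func() (any, error){}
    }
    if f, ok := c.results[key]; ok {
        return f
    }
    f := sync.OnceValues(calc) // runs calc once and caches its result and error
    c.results[key] = f
    return f
}

With such a cache, several analyzers asking for, say, the parsed original firmware would all receive the result of a single parsing pass (sync.OnceValues requires Go 1.21+).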

Analyzer-s were supposed to be able to re-use each other's results, but for simplicity of the implementation the Analyzer-s topology is flat, and a tree structure is created only for DataConverter-s. Maybe this will be fixed in v2.0.

Dependency injection

One of the major priorities of the project is dependency injection. Currently:

  • Analyzers are injectable if the controller is reimplemented. This problem was supposed to be solved differently, but for simplicity of the implementation this hacky solution is used for now.
  • Network components are injectable as is.
  • Observability API is injectable as is.
  • Hardware security analysis logic is injectable as is.
