[[PageOutline]]

= A Completely Informal Snapshot Of The Current State Of The Cryptech Project As Of 2014-11-06 =

This page contains a snapshot of the project's status and will almost certainly be obsolete by the time you read it.  If you find something that's wrong, please fix it!


== Cores ==
We have a bunch of cores, primarily for FPGA implementation.  Some of them implement cryptographic
algorithms or critical functionality such as the TRNG.  Others are support cores for the Cryptech HSM
implementation, others exist to aid development of the cores and the hardware, and some are just test code.

Cores that have been promoted to official Cryptech hardware cores:

* `core/chacha` - The !ChaCha stream cipher
* `core/sha1` - FIPS 180-2 SHA-1 hash
* `core/sha256` - FIPS 180-4 SHA-256 hash
* `core/sha512` - FIPS 180-4 SHA-512/x hash
* `core/trng` - The Cryptech TRNG subsystem. Uses !ChaCha, SHA-512 and entropy cores.
* `core/avalanche_entropy` - Avalanche entropy provider core. Requires external avalanche noise source.
* `core/rosc_entropy` - Digital ring oscillator based entropy provider core.

Utility, test, and board support cores:

* `core/coretest` - Core for performing command/response operations to drive testing of a core.
* `core/coretest_hashes` - Subsystem with coretest and the hash function cores as test objects.
* `core/coretest_test_core` - Coretest with a simple test core.
* `core/i2c` - I2C interface core.
* `core/novena` - Novena FPGA framework using the coretest protocol over I2C (see Builds below).
* `core/novena_eim` - Novena FPGA framework variant using the EIM bus (see Builds below).
* `core/novena_i2c_simple` - Novena FPGA framework variant with a simplified write()/read() API over I2C (see Builds below).
* `core/test_core` - Simple test core.
* `core/uart` - UART interface core to allow serial communication with FPGA functionality.
* `core/vndecorrelator` - von Neumann decorrelation core.

Documentation is very haphazard: some of the repositories have
detailed README.md files, but in many cases the documentation, what
there is of it, is probably meaningful only to the person who wrote
it.  This is not through any lack of good intent; it's just that what's
written assumes that the reader knows everything that the author does
about the other cores, the rest of the environment, and how everything
fits together.

== Builds ==

At this point I have figured out how to build two different FPGA
images for the Novena PVT1.  In both cases, I'm using the Makefile
rather than attempting to use the Xilinx GUI environment.

* `core/novena` builds the current set of digest cores into a
  framework that uses the "coretest" byte stream protocol over an I2C
  bus.

* `core/novena_i2c_simple` builds the current set of digest cores into
  a framework that uses a simplified write()/read() API over an I2C bus.

There's a third build, `core/novena_eim`, which was only just updated
today, and which is reported as not quite stable yet.  I will try
building it soon and report back here.

Both working builds (and, almost certainly, any useful build) involve
more than just the named repository.  `verilator`, when asked nicely,
will draw a graph of Verilog module relationships.  Take these graphs
with a grain of salt, as I am a long way from getting `verilator` to
run cleanly on any of this, but the current graphs may still be useful
in visualizing what's happening here.

At least some of the modules that `verilator` complains about not
being able to find appear to come from Xilinx libraries that
`verilator` doesn't know about.
See [[http://www.xilinx.com/support/documentation/sw_manuals/xilinx12_1/spartan6_hdl.pdf|Spartan-6 Libraries Guide for HDL Designs]] for details.

=== Module relationships in core/novena build ===

[[Image(novena__linkcells.svg)]]

=== Module relationships in core/novena_i2c_simple build ===

[[Image(novena_i2c_simple__linkcells.svg)]]

=== Module relationships in core/novena_eim build ===

[[Image(novena_eim__linkcells.svg)]]

=== Module relationships in core/trng build ===

By special request, here's a graph for the TRNG too, even though we
don't yet have a way to speak to it from the Novena:

[[Image(trng__linkcells.svg)]]

== C Code ==

Most of the cores have at least minimal test frameworks, written in a
combination of Verilog, C, and Python, but there's also a preliminary
port of Cryptlib to the Cryptech environment, in `sw/cryptlib`.  As of
this writing, the only Cryptech-specific features of this port, other
than a few makefile tricks, are:

* A set of HALs that make use of the `core/novena` and
  `core/novena_i2c_simple` FPGA builds, using the Linux /dev/i2c
  device interface; and

* Another Python script to test the resulting Cryptlib build, using
  the stock Cryptlib Python bindings.

No HAL for `core/novena_eim` yet.
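
To give a rough feel for what these HALs sit on top of, the shape of the Linux /dev/i2c access is sketched below.  Everything concrete in the sketch (the device node, the slave address, the command and response bytes) is a placeholder, not the actual values or framing used by the `core/novena` or `core/novena_i2c_simple` HALs; see the HAL sources in `sw/cryptlib` for the real thing.

{{{
#!c
/*
 * Minimal sketch of talking to an FPGA core over the Linux /dev/i2c
 * interface.  The device node, slave address and message contents are
 * placeholders, not what the real HALs use.
 */

#include <stdio.h>
#include <stdint.h>
#include <unistd.h>
#include <fcntl.h>
#include <sys/ioctl.h>
#include <linux/i2c-dev.h>

#define I2C_DEV  "/dev/i2c-2"   /* placeholder device node */
#define I2C_ADDR 0x0f           /* placeholder slave address */

int main(void)
{
  uint8_t request[] = { 0x00 };  /* placeholder command byte(s) */
  uint8_t response[32];
  int fd = open(I2C_DEV, O_RDWR);

  if (fd < 0) {
    perror("open");
    return 1;
  }

  /* Bind this file descriptor to the FPGA's I2C slave address. */
  if (ioctl(fd, I2C_SLAVE, I2C_ADDR) < 0) {
    perror("ioctl(I2C_SLAVE)");
    return 1;
  }

  /* Write a command to the core, then read back its response. */
  if (write(fd, request, sizeof(request)) != sizeof(request)) {
    perror("write");
    return 1;
  }
  if (read(fd, response, sizeof(response)) < 0) {
    perror("read");
    return 1;
  }

  close(fd);
  return 0;
}
}}}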

The Cryptlib Python bindings build kind of slowly on the Novena, sorry
about that.
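
For reference, the call sequence that the Python test script exercises looks roughly like the following when written against Cryptlib's ordinary C API (the script itself uses the stock Python bindings).  This is plain Cryptlib hashing code, nothing Cryptech-specific; the point of the port is that, when the library is built with one of the HALs above, operations like this presumably end up being serviced by the FPGA digest cores.

{{{
#!c
/*
 * Sketch of hashing a buffer through Cryptlib's C API.  This mirrors
 * the sort of call sequence the Python test script makes through the
 * stock Python bindings; it is generic Cryptlib usage.
 */

#include <stdio.h>
#include <string.h>
#include "cryptlib.h"

int main(void)
{
  CRYPT_CONTEXT hashContext;
  unsigned char hash[CRYPT_MAX_HASHSIZE];
  int hashLength;
  char data[] = "abc";

  cryptInit();

  /* Create a SHA-2 hash context. */
  if (cryptStatusError(cryptCreateContext(&hashContext, CRYPT_UNUSED,
                                          CRYPT_ALGO_SHA2)))
    return 1;

  /* Hash the data; a final call with length 0 completes the hash. */
  cryptEncrypt(hashContext, data, (int) strlen(data));
  cryptEncrypt(hashContext, data, 0);

  /* Read back the resulting hash value. */
  cryptGetAttributeString(hashContext, CRYPT_CTXINFO_HASHVALUE,
                          hash, &hashLength);
  printf("got %d bytes of hash\n", hashLength);

  cryptDestroyContext(hashContext);
  cryptEnd();
  return 0;
}
}}}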

== Hardware ==

The hardware guys have done cool stuff with hardware entropy sources.
I even have one of the noise boards, but until I have some way to
connect C code to the TRNG, I don't have much use for it other than to
admire the craftsmanship.  Soon, I hope.

== Tools ==

`verilator` has already been mentioned.  In addition to generating !GraphViz
input, `verilator` has a `--lint-only` mode which looks interesting.

(JS) Verilator is fairly usable, at least as a linter. Adding `-Wall` provides more warnings.
Since we already use at least Icarus Verilog (iverilog), Altera Quartus and Xilinx ISE, one would assume that between them they would provide all possible warnings. That is not the case: they all seem to find different things to warn about, and Verilator provides even more. The more parsers and checkers the better. But we will not be able to, or want to, fix all warnings; some things are by design. We should probably document what we ignore.

I haven't yet figured out whether we have any real use for
`verilator`'s core function of translating Verilog into C++.  I've
been toying with the idea of a software-only development environment,
where one simulates an embedded machine using two Unix processes: one
would be a virtual FPGA generated by `verilator`, the other would be a
classical deeply embedded system running as a single process.  The two
processes would communicate via a `PF_UNIX` socket or something on
that order.  It might be possible to jam everything into a single
process, but I suspect it wouldn't be worth the trouble.
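
To make that idea a little more concrete, the plumbing could be as simple as the sketch below: a parent process standing in for the embedded system, a child standing in for the `verilator`-generated virtual FPGA, and an `AF_UNIX` socket pair between them.  This is purely a hypothetical illustration; none of it exists in any repository.

{{{
#!c
/*
 * Hypothetical plumbing for the proposed two-process simulation: the
 * child stands in for the verilator-generated virtual FPGA, the parent
 * for the embedded software, and they talk over an AF_UNIX socket pair.
 * Purely an illustration of the idea, not code from any repository.
 */

#include <stdio.h>
#include <string.h>
#include <unistd.h>
#include <sys/socket.h>

int main(void)
{
  int fds[2];
  char buf[64];

  if (socketpair(AF_UNIX, SOCK_STREAM, 0, fds) < 0) {
    perror("socketpair");
    return 1;
  }

  switch (fork()) {

  case -1:
    perror("fork");
    return 1;

  case 0:
    /* "Virtual FPGA": service bus transactions arriving on fds[1].
       Here it just echoes whatever it receives, as a stand-in for a
       real response from the verilated model. */
    close(fds[0]);
    for (;;) {
      ssize_t n = read(fds[1], buf, sizeof(buf));
      if (n <= 0)
        return 0;
      write(fds[1], buf, n);
    }

  default:
    /* "Embedded system": issue a bus transaction and wait for the
       response, as the firmware would when poking FPGA registers. */
    close(fds[1]);
    write(fds[0], "write reg 0", 11);
    read(fds[0], buf, sizeof(buf));
    close(fds[0]);
    return 0;
  }
}
}}}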

Joachim has Makefiles which use `iverilog` to generate simulation
images.  Installing `iverilog` is easy enough (`apt-get install`, etc.)
but I haven't yet figured out how to do anything interesting with the
simulation images.  Joachim replies:

    There is help in the Makefile.  You run the targets, either as
    `make sim-foo` or just `./foo.sim`.  Most if not all tests are
    self-testing, with test cases, and should report the number of test
    cases and how many passed.  Which should be all of them.

As far as I know we've done nothing yet to deal with threats to the
tool chain (Thompson attack, etc.).