This crate provides a streaming CSV (comma separated values) writer and reader that works with the serialize crate to do type-based encoding and decoding. There are two primary goals of this project:
- The default mode of parsing should just work. This means the parser will bias toward providing a parse over a correct parse (with respect to RFC 4180).
- Convenient to use by default, but when performance is needed, the API will provide an escape hatch.
Licensed under the UNLICENSE.
Documentation
The API is fully documented with lots of examples: http://burntsushi.net/rustdoc/csv/.
Simple examples
Here is a full working Rust program that decodes records from a CSV file. Each record consists of two strings and an integer (the edit distance between the strings):
extern crate csv;

use std::path::Path;
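// A sketch of the rest of the program, assuming the crate's Reader::from_file
// constructor and its decode() iterator; exact signatures and error handling
// in this version may differ.
fn main() {
    let fp = &Path::new("./data/simple.csv");
    let mut rdr = csv::Reader::from_file(fp);

    for record in rdr.decode() {
        // Each record decodes into a (String, String, uint) tuple.
        let (s1, s2, dist): (String, String, uint) = record.unwrap();
        println!("({}, {}): {}", s1, s2, dist);
    }
}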
Don't like tuples? That's fine. Use a struct instead:
extern crate csv;
extern crate serialize;

use std::path::Path;
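// A sketch continuation, assuming #[deriving(Decodable)] from the serialize
// crate works with the decode() iterator; names and signatures follow this
// era's conventions and may differ.
#[deriving(Decodable)]
struct Record {
    s1: String,
    s2: String,
    dist: uint,
}

fn main() {
    let fp = &Path::new("./data/simple.csv");
    let mut rdr = csv::Reader::from_file(fp);

    for record in rdr.decode() {
        let record: Record = record.unwrap();
        println!("({}, {}): {}", record.s1, record.s2, record.dist);
    }
}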
Do some records not have a distance for some reason? Use an Option type!
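For example, a record type along these lines should do the trick (a sketch, assuming an empty field decodes to None; the file and field names are illustrative):

extern crate csv;
extern crate serialize;

use std::path::Path;

#[deriving(Decodable)]
struct Record {
    s1: String,
    s2: String,
    // A missing distance decodes to None instead of failing the record.
    dist: Option<uint>,
}

fn main() {
    let fp = &Path::new("./data/simple_missing.csv");
    let mut rdr = csv::Reader::from_file(fp);

    for record in rdr.decode() {
        let record: Record = record.unwrap();
        println!("({}, {}): {}", record.s1, record.s2, record.dist);
    }
}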
You can also read CSV headers, change the delimiter, use enum types or just get plain access to records as vectors of strings. There are examples with more details in the documentation.
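For instance, raw access with a tab delimiter might look roughly like this (a sketch assuming a builder-style delimiter method and a records() iterator over plain string vectors; see the documentation for the authoritative examples):

extern crate csv;

use std::path::Path;

fn main() {
    let fp = &Path::new("./data/simple.tsv");
    // Assumed builder-style configuration of the delimiter.
    let mut rdr = csv::Reader::from_file(fp).delimiter(b'\t');

    for row in rdr.records() {
        // Each row is a plain vector of strings, with no decoding involved.
        let row: Vec<String> = row.unwrap();
        println!("{}", row);
    }
}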
Installation
This crate works with Cargo. Assuming you have Rust and Cargo installed, simply check out the source and run the tests with cargo test.
You can also add csv as a dependency to your project's Cargo.toml using this git repo:
[dependencies.csv]
git = "git://github.com/BurntSushi/rust-csv"
Or, you can use csv from crates.io:
[dependencies]
csv = "*"
For now, I'd probably recommend that you use the git repository while we're all still tracking nightly, but I'll try to keep the crates.io package updated.
Benchmarks
There are some rough benchmarks (compared with Go) here: https://github.com/BurntSushi/rust-csv/tree/master/bench