Read HTTP Archives with Java.
<dependency>
    <groupId>de.sstoehr</groupId>
    <artifactId>har-reader</artifactId>
    <version>2.4.1</version>
</dependency>
Reading HAR from File:
HarReader harReader = new HarReader();
Har har = harReader.readFromFile(new File("myhar.har"));
System.out.println(har.getLog().getCreator().getName());
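The parsed model mirrors the HAR spec, so the recorded entries can be traversed with plain getters. A minimal sketch (accessor names such as getEntries(), getUrl() and getStatus() follow the spec field names; treat them as assumptions if your version differs):

import java.io.File;
import de.sstoehr.harreader.HarReader;
import de.sstoehr.harreader.model.Har;
import de.sstoehr.harreader.model.HarEntry;

public class PrintEntries {
    public static void main(String[] args) throws Exception {
        Har har = new HarReader().readFromFile(new File("myhar.har"));
        // Print URL and status code of every recorded request
        for (HarEntry entry : har.getLog().getEntries()) {
            System.out.println(entry.getRequest().getUrl()
                    + " -> " + entry.getResponse().getStatus());
        }
    }
}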
Reading HAR from String:
HarReader harReader = new HarReader();
Har har = harReader.readFromString("{ ... HAR-JSON-Data ... }");
Some HAR generators use date formats that do not conform to the specification. You can tell HAR reader to ignore those fields instead of throwing an exception:
HarReader harReader = new HarReader();
Har har = harReader.readFromFile(new File("myhar.har"), HarReaderMode.LAX);
Har har = harReader.readFromString("{ ... HAR-JSON-Data ... }", HarReaderMode.LAX);
You can also follow the next section and provide your own mapping configuration to deal with these fields.
Writing HAR to File:
Har har = new Har();
HarWriter harWriter = new HarWriter();
harWriter.writeTo(new File("myhar.har"), har);
Writing HAR to OutputStream:
Har har = new Har();
HarWriter harWriter = new HarWriter();
ByteArrayOutputStream baos = new ByteArrayOutputStream();
harWriter.writeTo(baos, har);
Writing HAR to Writer:
Har har = new Har();
HarWriter harWriter = new HarWriter();
StringWriter sw = new StringWriter();
harWriter.writeTo(sw, har);
Writing HAR as bytes:
Har har = new Har();
HarWriter harWriter = new HarWriter();
byte[] harBytes = harWriter.writeAsBytes(har);
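Reading and writing can of course be combined, for example to load a capture and serialize it back to JSON. A small sketch (copy.har is just an illustrative output name; import paths assume the de.sstoehr.harreader package layout):

import java.io.File;
import de.sstoehr.harreader.HarReader;
import de.sstoehr.harreader.HarWriter;
import de.sstoehr.harreader.model.Har;

public class RoundTrip {
    public static void main(String[] args) throws Exception {
        // Read an existing archive ...
        Har har = new HarReader().readFromFile(new File("myhar.har"));
        // ... and write it back out as JSON
        new HarWriter().writeTo(new File("copy.har"), har);
    }
}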
As of version 2.0.0 you can create your own MapperFactory (the default implementation is DefaultMapperFactory):
public class MyMapperFactory implements MapperFactory {

    @Override
    public ObjectMapper instance(HarReaderMode mode) {
        ObjectMapper mapper = new ObjectMapper();
        SimpleModule module = new SimpleModule();
        // configure the Jackson object mapper as needed
        mapper.registerModule(module);
        return mapper;
    }
}
You can now use your configuration by instantiating the HarReader with your MapperFactory:
HarReader harReader = new HarReader(new MyMapperFactory());
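For example, a custom MapperFactory can register a Jackson deserializer that accepts the non-standard date formats mentioned above. This is only a sketch, assuming the model uses java.util.Date for timestamps, that "yyyy-MM-dd HH:mm:ss" is the format your generator emits, and that MapperFactory and HarReaderMode live under the de.sstoehr.harreader packages:

import java.io.IOException;
import java.text.ParseException;
import java.text.SimpleDateFormat;
import java.util.Date;

import com.fasterxml.jackson.core.JsonParser;
import com.fasterxml.jackson.databind.DeserializationContext;
import com.fasterxml.jackson.databind.JsonDeserializer;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.module.SimpleModule;

public class LenientDateMapperFactory implements MapperFactory {

    @Override
    public ObjectMapper instance(HarReaderMode mode) {
        ObjectMapper mapper = new ObjectMapper();
        SimpleModule module = new SimpleModule();
        // Hypothetical example: accept "yyyy-MM-dd HH:mm:ss" instead of ISO 8601
        module.addDeserializer(Date.class, new JsonDeserializer<Date>() {
            @Override
            public Date deserialize(JsonParser p, DeserializationContext ctxt) throws IOException {
                try {
                    return new SimpleDateFormat("yyyy-MM-dd HH:mm:ss").parse(p.getValueAsString());
                } catch (ParseException e) {
                    return null; // ignore unparsable dates, similar to LAX mode
                }
            }
        });
        mapper.registerModule(module);
        return mapper;
    }
}

HarReader harReader = new HarReader(new LenientDateMapperFactory());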
- For changes, see 2.4.0
- Fixed an issue with duplicate fields introduced in 2.4.0
- Updated dependencies
- Added support for unknown HTTP methods or status codes
- Added support to serialize HAR data back to JSON
- Updated dependencies
- Requires Java 8 or later: dropped support for Java 7
- Updated dependencies
- #82: Make sure default values from HAR entities satisfy the specification
- Updated dependencies
- Added support for fields that are not part of the official spec. You can access these fields using Map<String, Object> getAdditional()
- Updated dependencies
- Updated dependencies
This is the first release that is provided both on GitHub and on the Maven Central repository.
- Updated dependencies
- Updated dependencies
- Updated dependencies
- Updated dependencies
- Updated dependencies
- Updated dependencies (CVE-2018-7489)
- Added support for several additional HTTP status codes (e.g. 308, 422-451, 505-511)
- Added support for the HTTP method PATCH
- You can now access additional fields that are not part of the HAR spec:
response.getAdditional().get("_transferSize");
- Added equals and hashCode methods
- Added CCM_POST to the HttpMethod enum
- Ignore invalid integers in lax mode
- HAR reader is now easier to customize. Use your own MapperFactory to adjust HAR reader for your project!
- HAR reader threw exceptions when required fields were empty. This behaviour was changed so that you can now read non-standard-compliant HAR files.