Our Jenkins server went down for a few hours without us noticing. In the meantime, a number of changes had been opened on our Gerrit instance. None of these changes were built automatically until we manually poked at them from Jenkins.
I propose adding a configuration option that defines named event channels which are buffered on disk.
For example, we could configure a 'jenkins' event channel, which could be as simple as writing the JSON events to a flat file named 'jenkins' somewhere. When a consumer calls 'stream-events --channel=jenkins', the buffered events are consumed first, before streaming events live. This allows Jenkins to catch up on missed events.
An alternative would be to assign a sequential serial number to each event, and allow stream-events callers to specify a --from=123456789 argument, but that would require some kind of log rotation / cleanup.
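To make the rotation concern concrete, here is a sketch of the alternative, assuming a bounded in-memory log; the class name, the `max_events` cap (standing in for whatever rotation policy Gerrit would use), and the method names are all hypothetical:

```python
from collections import OrderedDict

class EventLog:
    """Sketch of the serial-number alternative: each event gets a
    sequential number, and callers resume with --from=N.

    max_events stands in for the log rotation / cleanup the proposal
    mentions; events older than the window are dropped.
    """
    def __init__(self, max_events=100000):
        self.events = OrderedDict()  # seq -> event, oldest first
        self.next_seq = 1
        self.max_events = max_events

    def append(self, event):
        seq = self.next_seq
        self.next_seq += 1
        self.events[seq] = event
        # "Rotation": discard the oldest events beyond the window.
        while len(self.events) > self.max_events:
            self.events.popitem(last=False)
        return seq

    def since(self, from_seq):
        """Events with seq >= from_seq, i.e. what --from=N would replay."""
        return [(s, e) for s, e in self.events.items() if s >= from_seq]
```

Note the trade-off this exposes: a caller asking for a sequence number that has already been rotated away silently gets only the retained window, so the server would need to signal that gap somehow.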
Happy to hack on this if a maintainer can confirm the idea is good. I hear there are complications with regard to replication / multi-master setups, but in our scenario we only have a single instance and we need it to be a bit more resilient.
Thanks
Alex
There have been several discussions about how stream-events is fragile. I think the proposed solution was to either add a from={id} argument and send that id with each stream event, or to add a from={datetime} argument. Shawn has recommended that we (build bot authors) consider using the REST API, which already supports a from={datetime}-style argument. The downside is that REST is not event based: you must poll the server. I think the Gerrit maintainers would be OK with adding a from= argument to stream-events. It might also be worth investigating moving stream-events into a plugin?
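For reference, the REST polling approach can be sketched roughly as below. This assumes Gerrit's change search 'after:"timestamp"' query operator and the ")]}'" XSSI prefix its REST responses carry; the host is a placeholder, and the injectable `fetch` parameter is my own addition so the sketch can be exercised without a live server:

```python
import json
import urllib.parse
import urllib.request

GERRIT_URL = "https://gerrit.example.com"  # placeholder host

def changes_since(timestamp, fetch=urllib.request.urlopen):
    """Poll the REST changes endpoint for changes updated after
    `timestamp` (e.g. '2014-01-01 00:00:00').

    Sketch of the polling workaround Shawn suggested: query the
    /changes/ endpoint with an after:"..." operator and strip the
    XSSI protection prefix before parsing the JSON body.
    """
    query = urllib.parse.quote('after:"%s"' % timestamp)
    url = "%s/changes/?q=%s" % (GERRIT_URL, query)
    body = fetch(url).read().decode("utf-8")
    if body.startswith(")]}'"):
        body = body[4:]  # strip the XSSI protection prefix
    return json.loads(body)
```

A bot would call this in a loop, remembering the timestamp of the newest change it has seen, which is exactly the polling overhead that the event-based stream avoids.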