
Crash enrich at startup if Sentry DSN is not correct #388

Closed
benjben opened this issue Oct 30, 2020 · 4 comments
Labels: bug (Something isn't working), data-loss

Comments

benjben (Contributor) commented Oct 30, 2020

At the moment enrich fails only when an event is about to be sent to Sentry, e.g.:

Error message from worker:

```
java.lang.IllegalArgumentException: Illegal character in scheme name at index 0: "https://deadbeef26784ae994a8dfc9e42441f0@sentry.snplow.net/19?stacktrace.app.packages=com.snowplowanalytics.snowplow.enrich.beam&tags=cloud:GCP,pipeline_name:prod1,client_name:com_acme,region:europe-west3&release=1.3.2&async=false"
	java.net.URI.create(URI.java:852)
	io.sentry.dsn.Dsn.<init>(Dsn.java:41)
	io.sentry.SentryClientFactory.instantiateFrom(SentryClientFactory.java:114)
	io.sentry.SentryOptions.<init>(SentryOptions.java:44)
	io.sentry.SentryOptions.from(SentryOptions.java:89)
	io.sentry.SentryOptions.defaults(SentryOptions.java:116)
	io.sentry.SentryOptions.defaults(SentryOptions.java:100)
	io.sentry.Sentry.getStoredClient(Sentry.java:168)
	io.sentry.Sentry.capture(Sentry.java:235)
	com.snowplowanalytics.snowplow.enrich.beam.Enrich$.$anonfun$enrich$2(Enrich.scala:301)
	com.snowplowanalytics.snowplow.enrich.beam.Enrich$.$anonfun$enrich$2$adapted(Enrich.scala:299)
	scala.Option.foreach(Option.scala:407)
	com.snowplowanalytics.snowplow.enrich.beam.Enrich$.enrich(Enrich.scala:299)
	com.snowplowanalytics.snowplow.enrich.beam.Enrich$.$anonfun$enrichEvents$2(Enrich.scala:211)
	com.snowplowanalytics.snowplow.enrich.beam.utils$.timeMs(utils.scala:151)
	com.snowplowanalytics.snowplow.enrich.beam.Enrich$.$anonfun$enrichEvents$1(Enrich.scala:207)
	com.spotify.scio.util.Functions$$anon$7.processElement(Functions.scala:263)
Caused by: java.net.URISyntaxException: Illegal character in scheme name at index 0: "https://deadbeef6784ae994a8dfc9e42441f0@sentry.snplow.net/19?stacktrace.app.packages=com.snowplowanalytics.snowplow.enrich.beam&tags=cloud:GCP,pipeline_name:prod1,client_name:com_acme,region:europe-west3&release=1.3.2&async=false"
	java.net.URI$Parser.fail(URI.java:2848)
	java.net.URI$Parser.checkChars(URI.java:3021)
	java.net.URI$Parser.checkChar(URI.java:3031)
	java.net.URI$Parser.parse(URI.java:3047)
	java.net.URI.<init>(URI.java:588)
	java.net.URI.create(URI.java:850)
	io.sentry.dsn.Dsn.<init>(Dsn.java:41)
	io.sentry.SentryClientFactory.instantiateFrom(SentryClientFactory.java:114)
	io.sentry.SentryOptions.<init>(SentryOptions.java:44)
	io.sentry.SentryOptions.from(SentryOptions.java:89)
	io.sentry.SentryOptions.defaults(SentryOptions.java:116)
	io.sentry.SentryOptions.defaults(SentryOptions.java:100)
	io.sentry.Sentry.getStoredClient(Sentry.java:168)
	io.sentry.Sentry.capture(Sentry.java:235)
	com.snowplowanalytics.snowplow.enrich.beam.Enrich$.$anonfun$enrich$2(Enrich.scala:301)
	com.snowplowanalytics.snowplow.enrich.beam.Enrich$.$anonfun$enrich$2$adapted(Enrich.scala:299)
	scala.Option.foreach(Option.scala:407)
	com.snowplowanalytics.snowplow.enrich.beam.Enrich$.enrich(Enrich.scala:299)
	com.snowplowanalytics.snowplow.enrich.beam.Enrich$.$anonfun$enrichEvents$2(Enrich.scala:211)
	com.snowplowanalytics.snowplow.enrich.beam.utils$.timeMs(utils.scala:151)
	com.snowplowanalytics.snowplow.enrich.beam.Enrich$.$anonfun$enrichEvents$1(Enrich.scala:207)
	com.spotify.scio.util.Functions$$anon$7.processElement(Functions.scala:263)
	com.spotify.scio.util.Functions$$anon$7$DoFnInvoker.invokeProcessElement(Unknown Source)
	org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:227)
	org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:186)
	org.apache.beam.runners.dataflow.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:335)
	org.apache.beam.runners.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:44)
	org.apache.beam.runners.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:49)
	org.apache.beam.runners.dataflow.worker.SimpleParDoFn$1.output(SimpleParDoFn.java:280)
	org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.outputWindowedValue(SimpleDoFnRunner.java:267)
	org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.access$900(SimpleDoFnRunner.java:79)
	org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:413)
	org.apache.beam.sdk.transforms.DoFnOutputReceivers$WindowedContextOutputReceiver.output(DoFnOutputReceivers.java:73)
	org.apache.beam.sdk.transforms.MapElements$1.processElement(MapElements.java:139)
	org.apache.beam.sdk.transforms.MapElements$1$DoFnInvoker.invokeProcessElement(Unknown Source)
	org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:227)
	org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:186)
	org.apache.beam.runners.dataflow.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:335)
	org.apache.beam.runners.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:44)
	org.apache.beam.runners.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:49)
	org.apache.beam.runners.dataflow.worker.util.common.worker.ReadOperation.runReadLoop(ReadOperation.java:201)
	org.apache.beam.runners.dataflow.worker.util.common.worker.ReadOperation.start(ReadOperation.java:159)
	org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:77)
	org.apache.beam.runners.dataflow.worker.StreamingDataflowWorker.process(StreamingDataflowWorker.java:1369)
	org.apache.beam.runners.dataflow.worker.StreamingDataflowWorker.access$1100(StreamingDataflowWorker.java:154)
	org.apache.beam.runners.dataflow.worker.StreamingDataflowWorker$7.run(StreamingDataflowWorker.java:1088)
	java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	java.lang.Thread.run(Thread.java:748)
```
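The failure can be reproduced with plain `java.net.URI`: the DSN string in the trace still carries its surrounding double quotes, and `"` is not a legal character in a URI scheme. A minimal sketch (the DSN value below is hypothetical):

```java
import java.net.URI;

public class DsnRepro {
    public static void main(String[] args) {
        // A correctly configured DSN parses fine.
        URI ok = URI.create("https://deadbeef@sentry.example.com/19");
        System.out.println(ok.getScheme()); // https

        // A DSN that kept its surrounding quotes fails at index 0,
        // exactly as in the worker log above.
        try {
            URI.create("\"https://deadbeef@sentry.example.com/19\"");
        } catch (IllegalArgumentException e) {
            System.out.println(e.getMessage());
        }
    }
}
```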
@benjben benjben added bug Something isn't working data-loss labels Oct 30, 2020
@chuwy chuwy added this to the 1.5.0 milestone Oct 30, 2020
benjben (Contributor, Author) commented Oct 30, 2020

The plan is to:

  1. When enrich starts, try to parse the Sentry DSN with URI.create() and crash immediately if parsing fails
  2. If sending an event to Sentry fails, log an ERROR instead of crashing
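Step 1 could be sketched as follows; `validateDsn` is a hypothetical helper, not the actual enrich code, but it shows the fail-fast behavior:

```java
import java.net.URI;

public class DsnCheck {
    /**
     * Parses the DSN with URI.create() at startup. URI.create wraps
     * URISyntaxException in an IllegalArgumentException, so a malformed
     * DSN crashes here, before any event is enriched, instead of on the
     * first event that would be reported to Sentry.
     */
    static URI validateDsn(String rawDsn) {
        return URI.create(rawDsn);
    }

    public static void main(String[] args) {
        System.out.println(validateDsn("https://key@sentry.example.com/19").getHost());
    }
}
```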

benjben (Contributor, Author) commented Nov 10, 2020

Regarding 2), this is already handled by the Sentry API:

```java
/**
 * Sends a built {@link Event} to the Sentry server.
 *
 * @param event event to send to Sentry.
 */
public void sendEvent(Event event) {
    if (event == null) {
        return;
    }

    for (ShouldSendEventCallback shouldSendEventCallback : shouldSendEventCallbacks) {
        if (!shouldSendEventCallback.shouldSend(event)) {
            logger.trace("Not sending Event because of ShouldSendEventCallback: {}", shouldSendEventCallback);
            return;
        }
    }

    try {
        connection.send(event);
    } catch (LockedDownException | TooManyRequestsException e) {
        logger.debug("Dropping an Event due to lockdown: " + event);
    } catch (RuntimeException e) {
        logger.error("An exception occurred while sending the event to Sentry.", e);
    } finally {
        getContext().setLastEventId(event.getId());
    }
}
```

chuwy (Contributor) commented Nov 19, 2020

Hey @benjben! Can we rename this commit to add the Common: prefix, as it's common across all assets?

I think it was a wrong decision on my side to use these prefixes everywhere, as Common is very ambiguous and confusing. We need to either abandon them or come up with a better convention, but I just don't want to break it on a patch release.

benjben (Contributor, Author) commented Nov 19, 2020

I was a bit hesitant when creating this issue, for the reason that you mention. I didn't add Common: because this deals only with Stream Enrich and Beam Enrich. I should create one issue for each then; doing it now.
