
Comments (11)

heyams commented on August 28, 2024

@franden I haven't had a chance to test it yet. Will get back to you soon. Thanks for reminding me.


heyams commented on August 28, 2024

@franden we have updated the Azure doc: https://learn.microsoft.com/en-us/azure/azure-monitor/app/monitor-functions#configuration. We decided not to make that env var available because of the pre-warm-up pool.


franden commented on August 28, 2024

@heyams any idea how to solve the problem?


heyams commented on August 28, 2024

@franden which plan are you using? The consumption plan doesn't support this env var.


franden commented on August 28, 2024

@heyams thank you for your reply.

@franden which plan are you using? The consumption plan doesn't support this env var.

Yes, we are using the consumption plan. I cannot find any documentation saying that this env var is not supported on the consumption plan 🤔
But then, how can I pass the applicationinsights.json to my function? According to the documentation, this file must be located in the same directory as applicationinsights-agent.jar. If the "Collection level" is set to Recommended, the agent jar is attached automatically, and I am not even aware of where the jar is located.


heyams commented on August 28, 2024

@franden we can introduce a new env var for you to use, something like APPLICATIONINSIGHTS_INSTRUMENTATION_VERTX_ENABLE. Will that work? This requires the Azure Functions team to integrate a newer version of the Java agent, so it might take a while to reach production. Alternatively, I think there is a way to pass in a system property via the JVM arguments. Let me find out.
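To make the two options concrete, this is roughly what each would look like. Both are hypothetical here: the env var is only a proposal at this point, and whether the system property can actually reach the worker JVM is exactly what the rest of this thread discusses.

    APPLICATIONINSIGHTS_INSTRUMENTATION_VERTX_ENABLE=true   # proposed agent env var (does not exist yet)
    -Dotel.instrumentation.vertx.enabled=true               # the underlying OpenTelemetry opt-in flag as a JVM system property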


heyams commented on August 28, 2024

@franden Azure Functions has instructions for using a custom version of the Java agent.
You can deploy the JSON config along with the Java agent in the same path.

Can you try https://github.com/Azure/azure-functions-java-worker/wiki/Distributed-Tracing-for-Java-Azure-Functions#customize-distribute-agent and let me know if that works? I haven't tested it with the consumption plan, and I've reached out to the Functions team to confirm. The system property I mentioned above won't work because the consumption plan uses a pool of pre-warmed Java processes.
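As a rough sketch of what that JSON config could contain for this thread's case (the connection string key is standard; the exact key path for the Vert.x opt-in should be double-checked against the agent docs for the version you ship):

    {
      "connectionString": "<your-connection-string>",
      "preview": {
        "instrumentation": {
          "vertx": {
            "enabled": true
          }
        }
      }
    }

The file would be saved as applicationinsights.json next to the applicationinsights-agent jar you deploy.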


franden commented on August 28, 2024

Can you try https://github.com/Azure/azure-functions-java-worker/wiki/Distributed-Tracing-for-Java-Azure-Functions#customize-distribute-agent and let me know if that works?

@heyams that worked for me :-) thank you.
This approach requires additional configuration in pom.xml (roughly the copy step sketched below); it would be great if either the env var APPLICATIONINSIGHTS_CONFIGURATION_CONTENT or something like APPLICATIONINSIGHTS_INSTRUMENTATION_VERTX_ENABLE were supported on the consumption plan.
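For context, the extra build configuration is essentially a copy step that puts the agent jar (and the applicationinsights.json next to it) into the packaged function app. Below is a sketch using maven-dependency-plugin; the output directory, folder name, and version property are placeholders, so take the real values from the wiki page linked above.

    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-dependency-plugin</artifactId>
      <executions>
        <execution>
          <id>copy-appinsights-agent</id>
          <phase>package</phase>
          <goals>
            <goal>copy</goal>
          </goals>
          <configuration>
            <artifactItems>
              <artifactItem>
                <groupId>com.microsoft.azure</groupId>
                <artifactId>applicationinsights-agent</artifactId>
                <!-- property you define for the agent version you want to ship -->
                <version>${applicationinsights.agent.version}</version>
              </artifactItem>
            </artifactItems>
            <!-- placeholder path: use the directory the wiki instructions specify -->
            <outputDirectory>${project.build.directory}/azure-functions/${functionAppName}/agent</outputDirectory>
          </configuration>
        </execution>
      </executions>
    </plugin>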


franden commented on August 28, 2024

@franden we have updated the Azure doc: https://learn.microsoft.com/en-us/azure/azure-monitor/app/monitor-functions#configuration. We decided not to make that env var available because of the pre-warm-up pool.

😢 Packaging a custom Application Insights agent adds complexity to the build configuration.

I don't understand the issue with the pre-warm-up pool. Are there pre-warmed JVMs that already have the Application Insights agent attached, so that it is no longer possible to add additional configuration?


heyams commented on August 28, 2024

@franden it's a limitation of the consumption plan.
This is how it works currently:

-> consumption plan

  • The Azure Functions worker attaches the Java agent during startup; the Java agent is not enabled yet.
  • The Java agent starts up and processes all the opt-in OpenTelemetry instrumentation features.
  • The Azure Functions host gets pinged every minute for any new function code.
  • When new function code is detected, the host calls the Azure Functions worker, which triggers a specialization request against the pre-warmed pool; that is when the Java agent is enabled.
  • The Java agent then detects this and initializes all the config for the Azure Functions instrumentation.

At this point, it's too late to opt in to the OpenTelemetry instrumentation otel.instrumentation.vertx.enabled. That is why even if we add a new env var for it, it doesn't work. I hope this helps.

The workaround we provide in our docs, using languageWorkers__java__arguments, lets the Azure Functions host restart the Azure Functions worker and re-initialize the Java agent, which explains why it works in your case.
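For anyone following along, that app setting is configured like any other Azure Functions application setting, for example via the Azure CLI. The argument value below is only illustrative; whether a given JVM argument takes effect depends on the hosting plan, as discussed above.

    az functionapp config appsettings set \
      --name <function-app-name> \
      --resource-group <resource-group> \
      --settings "languageWorkers__java__arguments=-Dotel.instrumentation.vertx.enabled=true"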


microsoft-github-policy-service commented on August 28, 2024

This issue has been automatically marked as stale because it has been marked as requiring author feedback but has not had any activity for 7 days. It will be closed if no further activity occurs within 7 days of this comment.

