Leaderboard

Popular Content

Showing content with the highest reputation on 01/24/2021 in all areas

  1. Hi everyone, I'm a data scientist by trade and have been considering throwing together a script that could aid in dialing in injector slopes, breakpoints and deadtimes. The input would be a reasonably sized datalog of AFR, injector pulse widths and so on. Using some non-linear optimization, it should be possible to identify which variables need to change to minimize lambda errors, and also to differentiate speed density errors from injector scaling errors, since the lambda errors from injector scaling will track with injector pulse width (a rough sketch of this idea is included after the posts below). I've only just got my wideband sensor working ...
    1 point
  2. Honestly, it all sounds great and I think it would work if you can put the time in. I'd be happy to ask some tuners who might be willing to collect data for you if you are serious. The only concern is that this sounds like a year-long PhD project. Do you actually have the time to do this? If you do, we will do our best to support you, as there are many other applications I can think of that might have a business case behind them if this can be done in a short time period.
    1 point
  3. That's some good feedback. As you pointed out, there are quite a few assumptions and pitfalls when it comes to this. However, many of them also apply to manually dialing in injectors with a wideband, so I don't necessarily see them as issues that directly prevent automating the process - but that doesn't mean we should ignore them! Differentiating speed density errors from injector scaling errors is an interesting point and, I'll admit, something I've only considered from a theoretical standpoint. I would like to break it down to its simplest concept - there can be errors in lambda ...
    1 point
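The fitting idea described in the first post could look something like the sketch below. It is only a rough illustration under stated assumptions, not anyone's actual script: the datalog column names (pw_ms, lambda_meas, lambda_tgt), the file name datalog.csv, and the configured slope/deadtime constants are all made up for the example, deadtime is treated as constant rather than battery-voltage dependent, and the short-pulse-width breakpoint is ignored. It fits a "true" slope and deadtime with scipy.optimize.least_squares so that predicted fuelling matches the measured lambda error, then checks whether the remaining error still correlates with pulse width (pointing at injector scaling) rather than with load (pointing at the VE / speed density table).

# Minimal sketch, assuming a CSV datalog with hypothetical columns:
#   pw_ms       - commanded injector pulse width (ms)
#   lambda_meas - measured lambda from the wideband
#   lambda_tgt  - target lambda from the fuel table
# and that the slope/deadtime configured during the log are known.

import numpy as np
import pandas as pd
from scipy.optimize import least_squares

CFG_SLOPE_CC_MIN = 1000.0   # injector flow configured during the log (assumption)
CFG_DEADTIME_MS = 1.0       # configured deadtime at the logged battery voltage (assumption)

def predicted_fuel_ratio(params, pw_ms):
    # Ratio of fuel actually delivered to fuel the ECU intended to deliver,
    # for a candidate "true" injector slope and deadtime.
    true_slope, true_deadtime = params
    intended = CFG_SLOPE_CC_MIN * (pw_ms - CFG_DEADTIME_MS)
    actual = true_slope * (pw_ms - true_deadtime)
    return actual / intended

def residuals(params, pw_ms, lambda_meas, lambda_tgt):
    # Lean (lambda_meas > lambda_tgt) means less fuel was delivered than intended,
    # so the observed delivered/intended fuel ratio is lambda_tgt / lambda_meas.
    measured_ratio = lambda_tgt / lambda_meas
    return predicted_fuel_ratio(params, pw_ms) - measured_ratio

log = pd.read_csv("datalog.csv")   # hypothetical file name
pw = log["pw_ms"].to_numpy()
lam = log["lambda_meas"].to_numpy()
tgt = log["lambda_tgt"].to_numpy()

fit = least_squares(
    residuals,
    x0=[CFG_SLOPE_CC_MIN, CFG_DEADTIME_MS],   # start from the configured values
    args=(pw, lam, tgt),
)
print("suggested slope (cc/min):", fit.x[0])
print("suggested deadtime (ms): ", fit.x[1])

# Rough check on the speed-density vs injector-scaling question: residual lambda
# error that still correlates with pulse width points at injector scaling; if the
# log also has MAP/RPM, error that correlates with those instead points at the VE table.
resid = residuals(fit.x, pw, lam, tgt)
print("corr(residual, pulse width):", np.corrcoef(resid, pw)[0, 1])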