System Logs

logs/dev_detailed.log
Timestamp Level Source Processor Process ID Message Args
2026-03-26 10:32:31,917 DEBUG transformer_processor.py:__init__:199 TransformerProcessor 429b5f2f-e1f6-45dc-a7ba-ce95de83e0bf Transformer Init Timings
{"total_init_ms": 205.43, "super_init_ms": 110.47, "config_load_ms": 16.61, "client_init_ms": 0.03, "transcriber_init_ms": 78.02}
2026-03-26 10:32:31,916 DEBUG transformer_processor.py:__init__:179 TransformerProcessor 429b5f2f-e1f6-45dc-a7ba-ce95de83e0bf Transformer Processor initialisiert
{"model": "google/gemini-2.5-flash", "temperature": 0.1, "max_tokens": 4000, "target_format": "text", "max_concurrent_requests": 10, "timeout_seconds": 120, "templates_dir": "resources/templates"}
2026-03-26 10:32:31,712 DEBUG base_processor.py:init_logger:386 TransformerProcessor 429b5f2f-e1f6-45dc-a7ba-ce95de83e0bf Logger initialisiert
2026-03-26 10:32:16,759 DEBUG transformer_processor.py:__init__:199 TransformerProcessor e64fdb43-0d97-4b3f-bac1-1991d52ff5e4 Transformer Init Timings
{"total_init_ms": 212.78, "super_init_ms": 115.76, "config_load_ms": 16.98, "client_init_ms": 0.01, "transcriber_init_ms": 79.73}
2026-03-26 10:32:16,759 DEBUG transformer_processor.py:__init__:179 TransformerProcessor e64fdb43-0d97-4b3f-bac1-1991d52ff5e4 Transformer Processor initialisiert
{"model": "google/gemini-2.5-flash", "temperature": 0.1, "max_tokens": 4000, "target_format": "text", "max_concurrent_requests": 10, "timeout_seconds": 120, "templates_dir": "resources/templates"}
2026-03-26 10:32:16,547 DEBUG base_processor.py:init_logger:386 TransformerProcessor e64fdb43-0d97-4b3f-bac1-1991d52ff5e4 Logger initialisiert
2026-03-26 09:18:23,398 INFO transcription_utils.py:translate_text:722 TransformerProcessor job-f6c48f50-72d2-4dc1-8f2d-e3b5f9864fe2 Übersetzung abgeschlossen
{"duration_ms": 3287.0373725891113}
2026-03-26 09:18:23,398 DEBUG transcription_utils.py:create_llm_request:455 TransformerProcessor job-f6c48f50-72d2-4dc1-8f2d-e3b5f9864fe2 LLM Interaktion gespeichert
{"file": "cache/audio/temp/debug/llm/20260326_091823_translation.json", "tokens": 903}
2026-03-26 09:18:23,397 DEBUG transcription_utils.py:create_llm_request:416 TransformerProcessor job-f6c48f50-72d2-4dc1-8f2d-e3b5f9864fe2 LLM Request erstellt
{"purpose": "translation", "tokens": 903, "duration": 3287.0373725891113, "model": "google/gemini-2.5-flash", "processor": "AudioProcessor"}
2026-03-26 09:18:23,396 DEBUG base_processor.py:add_llm_requests:510 TransformerProcessor job-f6c48f50-72d2-4dc1-8f2d-e3b5f9864fe2 1 LLM-Requests hinzugefügt
2026-03-26 09:18:19,744 INFO transcription_utils.py:translate_text:632 TransformerProcessor job-f6c48f50-72d2-4dc1-8f2d-e3b5f9864fe2 Starte Übersetzung von nach en
2026-03-26 09:18:19,360 INFO transcription_utils.py:translate_text:722 TransformerProcessor job-f6c48f50-72d2-4dc1-8f2d-e3b5f9864fe2 Übersetzung abgeschlossen
{"duration_ms": 3958.2488536834717}
2026-03-26 09:18:19,360 DEBUG transcription_utils.py:create_llm_request:455 TransformerProcessor job-f6c48f50-72d2-4dc1-8f2d-e3b5f9864fe2 LLM Interaktion gespeichert
{"file": "cache/audio/temp/debug/llm/20260326_091819_translation.json", "tokens": 1430}
2026-03-26 09:18:19,359 DEBUG transcription_utils.py:create_llm_request:416 TransformerProcessor job-f6c48f50-72d2-4dc1-8f2d-e3b5f9864fe2 LLM Request erstellt
{"purpose": "translation", "tokens": 1430, "duration": 3958.2488536834717, "model": "google/gemini-2.5-flash", "processor": "AudioProcessor"}
2026-03-26 09:18:19,358 DEBUG base_processor.py:add_llm_requests:510 TransformerProcessor job-f6c48f50-72d2-4dc1-8f2d-e3b5f9864fe2 1 LLM-Requests hinzugefügt
2026-03-26 09:18:15,021 INFO transcription_utils.py:translate_text:632 TransformerProcessor job-f6c48f50-72d2-4dc1-8f2d-e3b5f9864fe2 Starte Übersetzung von nach en
2026-03-26 09:18:14,614 INFO transcription_utils.py:translate_text:722 TransformerProcessor job-f6c48f50-72d2-4dc1-8f2d-e3b5f9864fe2 Übersetzung abgeschlossen
{"duration_ms": 3586.7714881896973}
2026-03-26 09:18:14,613 DEBUG transcription_utils.py:create_llm_request:455 TransformerProcessor job-f6c48f50-72d2-4dc1-8f2d-e3b5f9864fe2 LLM Interaktion gespeichert
{"file": "cache/audio/temp/debug/llm/20260326_091814_translation.json", "tokens": 1348}
2026-03-26 09:18:14,612 DEBUG transcription_utils.py:create_llm_request:416 TransformerProcessor job-f6c48f50-72d2-4dc1-8f2d-e3b5f9864fe2 LLM Request erstellt
{"purpose": "translation", "tokens": 1348, "duration": 3586.7714881896973, "model": "google/gemini-2.5-flash", "processor": "AudioProcessor"}
2026-03-26 09:18:14,611 DEBUG base_processor.py:add_llm_requests:510 TransformerProcessor job-f6c48f50-72d2-4dc1-8f2d-e3b5f9864fe2 1 LLM-Requests hinzugefügt
2026-03-26 09:18:10,682 INFO transcription_utils.py:translate_text:632 TransformerProcessor job-f6c48f50-72d2-4dc1-8f2d-e3b5f9864fe2 Starte Übersetzung von nach en
2026-03-26 09:18:10,325 INFO transcription_utils.py:transcribe_segment:2025 TransformerProcessor job-f6c48f50-72d2-4dc1-8f2d-e3b5f9864fe2 Transkription erfolgreich
{"segment_id": 5, "duration_ms": 11446.587085723877, "tokens": 3345, "text_length": 2869}
2026-03-26 09:18:10,325 DEBUG base_processor.py:add_llm_requests:510 TransformerProcessor job-f6c48f50-72d2-4dc1-8f2d-e3b5f9864fe2 1 LLM-Requests hinzugefügt
2026-03-26 09:18:09,624 INFO transcription_utils.py:transcribe_segment:2025 TransformerProcessor job-f6c48f50-72d2-4dc1-8f2d-e3b5f9864fe2 Transkription erfolgreich
{"segment_id": 6, "duration_ms": 10391.811847686768, "tokens": 3363, "text_length": 3134}
2026-03-26 09:18:09,623 DEBUG base_processor.py:add_llm_requests:510 TransformerProcessor job-f6c48f50-72d2-4dc1-8f2d-e3b5f9864fe2 1 LLM-Requests hinzugefügt
2026-03-26 09:18:07,990 INFO transcription_utils.py:transcribe_segment:2025 TransformerProcessor job-f6c48f50-72d2-4dc1-8f2d-e3b5f9864fe2 Transkription erfolgreich
{"segment_id": 7, "duration_ms": 8375.763654708862, "tokens": 3094, "text_length": 1817}
2026-03-26 09:18:07,990 DEBUG base_processor.py:add_llm_requests:510 TransformerProcessor job-f6c48f50-72d2-4dc1-8f2d-e3b5f9864fe2 1 LLM-Requests hinzugefügt
2026-03-26 09:17:59,614 INFO transcription_utils.py:transcribe_segment:1793 TransformerProcessor job-f6c48f50-72d2-4dc1-8f2d-e3b5f9864fe2 Starte Transkription von Segment
{"segment_id": 7, "segment_title": null, "file_path": "cache/audio/temp/0266301a17e2fa83d2e1647b5e60cd1e/chapter_0/segment_7.mp3"}
2026-03-26 09:17:59,230 INFO transcription_utils.py:transcribe_segment:1793 TransformerProcessor job-f6c48f50-72d2-4dc1-8f2d-e3b5f9864fe2 Starte Transkription von Segment
{"segment_id": 6, "segment_title": null, "file_path": "cache/audio/temp/0266301a17e2fa83d2e1647b5e60cd1e/chapter_0/segment_6.mp3"}
2026-03-26 09:17:58,878 INFO transcription_utils.py:transcribe_segment:1793 TransformerProcessor job-f6c48f50-72d2-4dc1-8f2d-e3b5f9864fe2 Starte Transkription von Segment
{"segment_id": 5, "segment_title": null, "file_path": "cache/audio/temp/0266301a17e2fa83d2e1647b5e60cd1e/chapter_0/segment_5.mp3"}
2026-03-26 09:17:58,481 INFO transcription_utils.py:translate_text:722 TransformerProcessor job-f6c48f50-72d2-4dc1-8f2d-e3b5f9864fe2 Übersetzung abgeschlossen
{"duration_ms": 4341.109275817871}
2026-03-26 09:17:58,481 DEBUG transcription_utils.py:create_llm_request:455 TransformerProcessor job-f6c48f50-72d2-4dc1-8f2d-e3b5f9864fe2 LLM Interaktion gespeichert
{"file": "cache/audio/temp/debug/llm/20260326_091758_translation.json", "tokens": 1661}
2026-03-26 09:17:58,480 DEBUG transcription_utils.py:create_llm_request:416 TransformerProcessor job-f6c48f50-72d2-4dc1-8f2d-e3b5f9864fe2 LLM Request erstellt
{"purpose": "translation", "tokens": 1661, "duration": 4341.109275817871, "model": "google/gemini-2.5-flash", "processor": "AudioProcessor"}
2026-03-26 09:17:58,480 DEBUG base_processor.py:add_llm_requests:510 TransformerProcessor job-f6c48f50-72d2-4dc1-8f2d-e3b5f9864fe2 1 LLM-Requests hinzugefügt
2026-03-26 09:17:53,790 INFO transcription_utils.py:translate_text:632 TransformerProcessor job-f6c48f50-72d2-4dc1-8f2d-e3b5f9864fe2 Starte Übersetzung von nach en
2026-03-26 09:17:53,414 INFO transcription_utils.py:translate_text:722 TransformerProcessor job-f6c48f50-72d2-4dc1-8f2d-e3b5f9864fe2 Übersetzung abgeschlossen
{"duration_ms": 6406.482696533203}
2026-03-26 09:17:53,414 DEBUG transcription_utils.py:create_llm_request:455 TransformerProcessor job-f6c48f50-72d2-4dc1-8f2d-e3b5f9864fe2 LLM Interaktion gespeichert
{"file": "cache/audio/temp/debug/llm/20260326_091753_translation.json", "tokens": 1591}
2026-03-26 09:17:53,413 DEBUG transcription_utils.py:create_llm_request:416 TransformerProcessor job-f6c48f50-72d2-4dc1-8f2d-e3b5f9864fe2 LLM Request erstellt
{"purpose": "translation", "tokens": 1591, "duration": 6406.482696533203, "model": "google/gemini-2.5-flash", "processor": "AudioProcessor"}
2026-03-26 09:17:53,412 DEBUG base_processor.py:add_llm_requests:510 TransformerProcessor job-f6c48f50-72d2-4dc1-8f2d-e3b5f9864fe2 1 LLM-Requests hinzugefügt
2026-03-26 09:17:46,598 INFO transcription_utils.py:translate_text:632 TransformerProcessor job-f6c48f50-72d2-4dc1-8f2d-e3b5f9864fe2 Starte Übersetzung von nach en
2026-03-26 09:17:46,241 INFO transcription_utils.py:translate_text:722 TransformerProcessor job-f6c48f50-72d2-4dc1-8f2d-e3b5f9864fe2 Übersetzung abgeschlossen
{"duration_ms": 7245.054721832275}
2026-03-26 09:17:46,241 DEBUG transcription_utils.py:create_llm_request:455 TransformerProcessor job-f6c48f50-72d2-4dc1-8f2d-e3b5f9864fe2 LLM Interaktion gespeichert
{"file": "cache/audio/temp/debug/llm/20260326_091746_translation.json", "tokens": 1576}
2026-03-26 09:17:46,241 DEBUG transcription_utils.py:create_llm_request:416 TransformerProcessor job-f6c48f50-72d2-4dc1-8f2d-e3b5f9864fe2 LLM Request erstellt
{"purpose": "translation", "tokens": 1576, "duration": 7245.054721832275, "model": "google/gemini-2.5-flash", "processor": "AudioProcessor"}
2026-03-26 09:17:46,240 DEBUG base_processor.py:add_llm_requests:510 TransformerProcessor job-f6c48f50-72d2-4dc1-8f2d-e3b5f9864fe2 1 LLM-Requests hinzugefügt
2026-03-26 09:17:38,617 INFO transcription_utils.py:translate_text:632 TransformerProcessor job-f6c48f50-72d2-4dc1-8f2d-e3b5f9864fe2 Starte Übersetzung von nach en
2026-03-26 09:17:38,256 INFO transcription_utils.py:translate_text:722 TransformerProcessor job-f6c48f50-72d2-4dc1-8f2d-e3b5f9864fe2 Übersetzung abgeschlossen
{"duration_ms": 5800.617456436157}
2026-03-26 09:17:38,256 DEBUG transcription_utils.py:create_llm_request:455 TransformerProcessor job-f6c48f50-72d2-4dc1-8f2d-e3b5f9864fe2 LLM Interaktion gespeichert
{"file": "cache/audio/temp/debug/llm/20260326_091738_translation.json", "tokens": 1417}
2026-03-26 09:17:38,255 DEBUG transcription_utils.py:create_llm_request:416 TransformerProcessor job-f6c48f50-72d2-4dc1-8f2d-e3b5f9864fe2 LLM Request erstellt
{"purpose": "translation", "tokens": 1417, "duration": 5800.617456436157, "model": "google/gemini-2.5-flash", "processor": "AudioProcessor"}
2026-03-26 09:17:38,255 DEBUG base_processor.py:add_llm_requests:510 TransformerProcessor job-f6c48f50-72d2-4dc1-8f2d-e3b5f9864fe2 1 LLM-Requests hinzugefügt
2026-03-26 09:17:32,123 INFO transcription_utils.py:translate_text:632 TransformerProcessor job-f6c48f50-72d2-4dc1-8f2d-e3b5f9864fe2 Starte Übersetzung von nach en
2026-03-26 09:17:31,793 INFO transcription_utils.py:translate_text:722 TransformerProcessor job-f6c48f50-72d2-4dc1-8f2d-e3b5f9864fe2 Übersetzung abgeschlossen
{"duration_ms": 4779.660224914551}
2026-03-26 09:17:31,793 DEBUG transcription_utils.py:create_llm_request:455 TransformerProcessor job-f6c48f50-72d2-4dc1-8f2d-e3b5f9864fe2 LLM Interaktion gespeichert
{"file": "cache/audio/temp/debug/llm/20260326_091731_translation.json", "tokens": 1531}
2026-03-26 09:17:31,792 DEBUG transcription_utils.py:create_llm_request:416 TransformerProcessor job-f6c48f50-72d2-4dc1-8f2d-e3b5f9864fe2 LLM Request erstellt
{"purpose": "translation", "tokens": 1531, "duration": 4779.660224914551, "model": "google/gemini-2.5-flash", "processor": "AudioProcessor"}
2026-03-26 09:17:31,792 DEBUG base_processor.py:add_llm_requests:510 TransformerProcessor job-f6c48f50-72d2-4dc1-8f2d-e3b5f9864fe2 1 LLM-Requests hinzugefügt
2026-03-26 09:17:26,573 INFO transcription_utils.py:translate_text:632 TransformerProcessor job-f6c48f50-72d2-4dc1-8f2d-e3b5f9864fe2 Starte Übersetzung von nach en
2026-03-26 09:17:26,237 INFO transcription_utils.py:transcribe_segment:2025 TransformerProcessor job-f6c48f50-72d2-4dc1-8f2d-e3b5f9864fe2 Transkription erfolgreich
{"segment_id": 4, "duration_ms": 11725.780725479126, "tokens": 3503, "text_length": 3491}
2026-03-26 09:17:26,236 DEBUG base_processor.py:add_llm_requests:510 TransformerProcessor job-f6c48f50-72d2-4dc1-8f2d-e3b5f9864fe2 1 LLM-Requests hinzugefügt
2026-03-26 09:17:25,909 INFO transcription_utils.py:transcribe_segment:2025 TransformerProcessor job-f6c48f50-72d2-4dc1-8f2d-e3b5f9864fe2 Transkription erfolgreich
{"segment_id": 3, "duration_ms": 11743.066310882568, "tokens": 3450, "text_length": 3506}
2026-03-26 09:17:25,909 DEBUG base_processor.py:add_llm_requests:510 TransformerProcessor job-f6c48f50-72d2-4dc1-8f2d-e3b5f9864fe2 1 LLM-Requests hinzugefügt
2026-03-26 09:17:25,565 INFO transcription_utils.py:transcribe_segment:2025 TransformerProcessor job-f6c48f50-72d2-4dc1-8f2d-e3b5f9864fe2 Transkription erfolgreich
{"segment_id": 2, "duration_ms": 11760.74194908142, "tokens": 3414, "text_length": 3314}
2026-03-26 09:17:25,564 DEBUG base_processor.py:add_llm_requests:510 TransformerProcessor job-f6c48f50-72d2-4dc1-8f2d-e3b5f9864fe2 1 LLM-Requests hinzugefügt
2026-03-26 09:17:25,210 INFO transcription_utils.py:transcribe_segment:2025 TransformerProcessor job-f6c48f50-72d2-4dc1-8f2d-e3b5f9864fe2 Transkription erfolgreich
{"segment_id": 1, "duration_ms": 11734.145402908325, "tokens": 3380, "text_length": 3128}
2026-03-26 09:17:25,209 DEBUG base_processor.py:add_llm_requests:510 TransformerProcessor job-f6c48f50-72d2-4dc1-8f2d-e3b5f9864fe2 1 LLM-Requests hinzugefügt
2026-03-26 09:17:24,882 INFO transcription_utils.py:transcribe_segment:2025 TransformerProcessor job-f6c48f50-72d2-4dc1-8f2d-e3b5f9864fe2 Transkription erfolgreich
{"segment_id": 0, "duration_ms": 11720.7612991333, "tokens": 3403, "text_length": 3258}
2026-03-26 09:17:24,881 DEBUG base_processor.py:add_llm_requests:510 TransformerProcessor job-f6c48f50-72d2-4dc1-8f2d-e3b5f9864fe2 1 LLM-Requests hinzugefügt
2026-03-26 09:17:14,510 INFO transcription_utils.py:transcribe_segment:1793 TransformerProcessor job-f6c48f50-72d2-4dc1-8f2d-e3b5f9864fe2 Starte Transkription von Segment
{"segment_id": 4, "segment_title": null, "file_path": "cache/audio/temp/0266301a17e2fa83d2e1647b5e60cd1e/chapter_0/segment_4.mp3"}
2026-03-26 09:17:14,165 INFO transcription_utils.py:transcribe_segment:1793 TransformerProcessor job-f6c48f50-72d2-4dc1-8f2d-e3b5f9864fe2 Starte Transkription von Segment
{"segment_id": 3, "segment_title": null, "file_path": "cache/audio/temp/0266301a17e2fa83d2e1647b5e60cd1e/chapter_0/segment_3.mp3"}
2026-03-26 09:17:13,803 INFO transcription_utils.py:transcribe_segment:1793 TransformerProcessor job-f6c48f50-72d2-4dc1-8f2d-e3b5f9864fe2 Starte Transkription von Segment
{"segment_id": 2, "segment_title": null, "file_path": "cache/audio/temp/0266301a17e2fa83d2e1647b5e60cd1e/chapter_0/segment_2.mp3"}
2026-03-26 09:17:13,475 INFO transcription_utils.py:transcribe_segment:1793 TransformerProcessor job-f6c48f50-72d2-4dc1-8f2d-e3b5f9864fe2 Starte Transkription von Segment
{"segment_id": 1, "segment_title": null, "file_path": "cache/audio/temp/0266301a17e2fa83d2e1647b5e60cd1e/chapter_0/segment_1.mp3"}
2026-03-26 09:17:13,160 INFO transcription_utils.py:transcribe_segment:1793 TransformerProcessor job-f6c48f50-72d2-4dc1-8f2d-e3b5f9864fe2 Starte Transkription von Segment
{"segment_id": 0, "segment_title": null, "file_path": "cache/audio/temp/0266301a17e2fa83d2e1647b5e60cd1e/chapter_0/segment_0.mp3"}
2026-03-26 09:17:12,847 INFO transcription_utils.py:transcribe_segments:2175 TransformerProcessor job-f6c48f50-72d2-4dc1-8f2d-e3b5f9864fe2 TRANSKRIPTION-DEBUG: Starte parallele Transkription von 1 Segmenten/Kapiteln
2026-03-26 09:17:12,831 DEBUG audio_processor.py:get_audio_segments:553 TransformerProcessor job-f6c48f50-72d2-4dc1-8f2d-e3b5f9864fe2 Kapitel 1 Teil 8/8 erstellt
{"duration_sec": 264.867, "segment_path": "cache/audio/temp/0266301a17e2fa83d2e1647b5e60cd1e/chapter_0/segment_7.mp3"}
2026-03-26 09:17:11,813 DEBUG audio_processor.py:get_audio_segments:553 TransformerProcessor job-f6c48f50-72d2-4dc1-8f2d-e3b5f9864fe2 Kapitel 1 Teil 7/8 erstellt
{"duration_sec": 264.867, "segment_path": "cache/audio/temp/0266301a17e2fa83d2e1647b5e60cd1e/chapter_0/segment_6.mp3"}
2026-03-26 09:17:10,805 DEBUG audio_processor.py:get_audio_segments:553 TransformerProcessor job-f6c48f50-72d2-4dc1-8f2d-e3b5f9864fe2 Kapitel 1 Teil 6/8 erstellt
{"duration_sec": 264.867, "segment_path": "cache/audio/temp/0266301a17e2fa83d2e1647b5e60cd1e/chapter_0/segment_5.mp3"}
2026-03-26 09:17:09,785 DEBUG audio_processor.py:get_audio_segments:553 TransformerProcessor job-f6c48f50-72d2-4dc1-8f2d-e3b5f9864fe2 Kapitel 1 Teil 5/8 erstellt
{"duration_sec": 264.867, "segment_path": "cache/audio/temp/0266301a17e2fa83d2e1647b5e60cd1e/chapter_0/segment_4.mp3"}
2026-03-26 09:17:08,766 DEBUG audio_processor.py:get_audio_segments:553 TransformerProcessor job-f6c48f50-72d2-4dc1-8f2d-e3b5f9864fe2 Kapitel 1 Teil 4/8 erstellt
{"duration_sec": 264.867, "segment_path": "cache/audio/temp/0266301a17e2fa83d2e1647b5e60cd1e/chapter_0/segment_3.mp3"}
2026-03-26 09:17:07,749 DEBUG audio_processor.py:get_audio_segments:553 TransformerProcessor job-f6c48f50-72d2-4dc1-8f2d-e3b5f9864fe2 Kapitel 1 Teil 3/8 erstellt
{"duration_sec": 264.867, "segment_path": "cache/audio/temp/0266301a17e2fa83d2e1647b5e60cd1e/chapter_0/segment_2.mp3"}
2026-03-26 09:17:06,788 DEBUG audio_processor.py:get_audio_segments:553 TransformerProcessor job-f6c48f50-72d2-4dc1-8f2d-e3b5f9864fe2 Kapitel 1 Teil 2/8 erstellt
{"duration_sec": 264.867, "segment_path": "cache/audio/temp/0266301a17e2fa83d2e1647b5e60cd1e/chapter_0/segment_1.mp3"}
2026-03-26 09:17:05,808 DEBUG audio_processor.py:get_audio_segments:553 TransformerProcessor job-f6c48f50-72d2-4dc1-8f2d-e3b5f9864fe2 Kapitel 1 Teil 1/8 erstellt
{"duration_sec": 264.867, "segment_path": "cache/audio/temp/0266301a17e2fa83d2e1647b5e60cd1e/chapter_0/segment_0.mp3"}
2026-03-26 09:17:04,443 INFO audio_processor.py:get_audio_segments:515 TransformerProcessor job-f6c48f50-72d2-4dc1-8f2d-e3b5f9864fe2 Kapitel 1 zu lang, teile es auf
{"duration_minutes": 35.31561666666666, "max_duration_minutes": 5.0}
2026-03-26 09:17:02,235 DEBUG audio_processor.py:_create_cache_key:665 TransformerProcessor job-f6c48f50-72d2-4dc1-8f2d-e3b5f9864fe2 Cache-Schlüssel erstellt: Bwiki Gespräch alex aryan.m4a|lang=en
2026-03-26 09:17:01,588 INFO audio_processor.py:__init__:225 TransformerProcessor job-f6c48f50-72d2-4dc1-8f2d-e3b5f9864fe2 Gesamte Initialisierungszeit: 361.22 ms
2026-03-26 09:17:01,521 DEBUG transformer_processor.py:__init__:199 TransformerProcessor job-f6c48f50-72d2-4dc1-8f2d-e3b5f9864fe2 Transformer Init Timings
{"total_init_ms": 182.82, "super_init_ms": 85.67, "config_load_ms": 16.89, "client_init_ms": 0.01, "transcriber_init_ms": 79.93}
2026-03-26 09:17:01,521 DEBUG transformer_processor.py:__init__:179 TransformerProcessor job-f6c48f50-72d2-4dc1-8f2d-e3b5f9864fe2 Transformer Processor initialisiert
{"model": "google/gemini-2.5-flash", "temperature": 0.1, "max_tokens": 4000, "target_format": "text", "max_concurrent_requests": 10, "timeout_seconds": 120, "templates_dir": "resources/templates"}
2026-03-26 09:17:01,228 DEBUG base_processor.py:init_logger:386 AudioProcessor job-f6c48f50-72d2-4dc1-8f2d-e3b5f9864fe2 Logger initialisiert
2026-03-26 06:31:42,057 INFO transcription_utils.py:translate_text:722 TransformerProcessor job-47989ebe-2ee6-4a61-8d10-a9ba375d44fb Übersetzung abgeschlossen
{"duration_ms": 1813.1132125854492}
2026-03-26 06:31:42,057 DEBUG transcription_utils.py:create_llm_request:455 TransformerProcessor job-47989ebe-2ee6-4a61-8d10-a9ba375d44fb LLM Interaktion gespeichert
{"file": "cache/audio/temp/debug/llm/20260326_063142_translation.json", "tokens": 324}
2026-03-26 06:31:42,057 DEBUG transcription_utils.py:create_llm_request:416 TransformerProcessor job-47989ebe-2ee6-4a61-8d10-a9ba375d44fb LLM Request erstellt
{"purpose": "translation", "tokens": 324, "duration": 1813.1132125854492, "model": "google/gemini-2.5-flash", "processor": "AudioProcessor"}
2026-03-26 06:31:42,056 DEBUG base_processor.py:add_llm_requests:510 TransformerProcessor job-47989ebe-2ee6-4a61-8d10-a9ba375d44fb 1 LLM-Requests hinzugefügt
2026-03-26 06:31:39,901 INFO transcription_utils.py:translate_text:632 TransformerProcessor job-47989ebe-2ee6-4a61-8d10-a9ba375d44fb Starte Übersetzung von nach de
2026-03-26 06:31:39,555 INFO transcription_utils.py:transcribe_segment:2025 TransformerProcessor job-47989ebe-2ee6-4a61-8d10-a9ba375d44fb Transkription erfolgreich
{"segment_id": 0, "duration_ms": 6340.77000617981, "tokens": 1665, "text_length": 705}
2026-03-26 06:31:39,554 DEBUG base_processor.py:add_llm_requests:510 TransformerProcessor job-47989ebe-2ee6-4a61-8d10-a9ba375d44fb 1 LLM-Requests hinzugefügt
2026-03-26 06:31:33,213 INFO transcription_utils.py:transcribe_segment:1793 TransformerProcessor job-47989ebe-2ee6-4a61-8d10-a9ba375d44fb Starte Transkription von Segment
{"segment_id": 0, "segment_title": null, "file_path": "cache/audio/temp/43b0fc33695d2252d85fc38ada6ec124/chapter_0/full.mp3"}
2026-03-26 06:31:32,893 INFO transcription_utils.py:transcribe_segments:2175 TransformerProcessor job-47989ebe-2ee6-4a61-8d10-a9ba375d44fb TRANSKRIPTION-DEBUG: Starte parallele Transkription von 1 Segmenten/Kapiteln
2026-03-26 06:31:32,893 DEBUG audio_processor.py:get_audio_segments:580 TransformerProcessor job-47989ebe-2ee6-4a61-8d10-a9ba375d44fb Kapitel 1 als einzelnes Segment erstellt
{"duration_sec": 151.162, "segment_path": "cache/audio/temp/43b0fc33695d2252d85fc38ada6ec124/chapter_0/full.mp3"}
2026-03-26 06:31:32,153 DEBUG audio_processor.py:_create_cache_key:665 TransformerProcessor job-47989ebe-2ee6-4a61-8d10-a9ba375d44fb Cache-Schlüssel erstellt: Sprache 006.m4a|lang=de
2026-03-26 06:31:31,505 INFO audio_processor.py:__init__:225 TransformerProcessor job-47989ebe-2ee6-4a61-8d10-a9ba375d44fb Gesamte Initialisierungszeit: 361.13 ms
2026-03-26 06:31:31,435 DEBUG transformer_processor.py:__init__:199 TransformerProcessor job-47989ebe-2ee6-4a61-8d10-a9ba375d44fb Transformer Init Timings
{"total_init_ms": 184.04, "super_init_ms": 85.44, "config_load_ms": 16.71, "client_init_ms": 0.01, "transcriber_init_ms": 81.54}
2026-03-26 06:31:31,435 DEBUG transformer_processor.py:__init__:179 TransformerProcessor job-47989ebe-2ee6-4a61-8d10-a9ba375d44fb Transformer Processor initialisiert
{"model": "google/gemini-2.5-flash", "temperature": 0.1, "max_tokens": 4000, "target_format": "text", "max_concurrent_requests": 10, "timeout_seconds": 120, "templates_dir": "resources/templates"}
2026-03-26 06:31:31,145 DEBUG base_processor.py:init_logger:386 AudioProcessor job-47989ebe-2ee6-4a61-8d10-a9ba375d44fb Logger initialisiert
2026-03-26 06:30:00,326 INFO transcription_utils.py:translate_text:722 TransformerProcessor job-cf6fbce0-8a0f-4c73-96de-10ccd6b3f943 Übersetzung abgeschlossen
{"duration_ms": 1203.8602828979492}
2026-03-26 06:30:00,326 DEBUG transcription_utils.py:create_llm_request:455 TransformerProcessor job-cf6fbce0-8a0f-4c73-96de-10ccd6b3f943 LLM Interaktion gespeichert
{"file": "cache/audio/temp/debug/llm/20260326_063000_translation.json", "tokens": 296}
2026-03-26 06:30:00,326 DEBUG transcription_utils.py:create_llm_request:416 TransformerProcessor job-cf6fbce0-8a0f-4c73-96de-10ccd6b3f943 LLM Request erstellt
{"purpose": "translation", "tokens": 296, "duration": 1203.8602828979492, "model": "google/gemini-2.5-flash", "processor": "AudioProcessor"}
2026-03-26 06:30:00,325 DEBUG base_processor.py:add_llm_requests:510 TransformerProcessor job-cf6fbce0-8a0f-4c73-96de-10ccd6b3f943 1 LLM-Requests hinzugefügt
2026-03-26 06:29:58,756 INFO transcription_utils.py:translate_text:632 TransformerProcessor job-cf6fbce0-8a0f-4c73-96de-10ccd6b3f943 Starte Übersetzung von nach en
2026-03-26 06:29:58,438 INFO transcription_utils.py:transcribe_segment:2025 TransformerProcessor job-cf6fbce0-8a0f-4c73-96de-10ccd6b3f943 Transkription erfolgreich
{"segment_id": 0, "duration_ms": 5339.659929275513, "tokens": 1661, "text_length": 675}
2026-03-26 06:29:58,437 DEBUG base_processor.py:add_llm_requests:510 TransformerProcessor job-cf6fbce0-8a0f-4c73-96de-10ccd6b3f943 1 LLM-Requests hinzugefügt
2026-03-26 06:29:53,097 INFO transcription_utils.py:transcribe_segment:1793 TransformerProcessor job-cf6fbce0-8a0f-4c73-96de-10ccd6b3f943 Starte Transkription von Segment
{"segment_id": 0, "segment_title": null, "file_path": "cache/audio/temp/43b0fc33695d2252d85fc38ada6ec124/chapter_0/full.mp3"}
2026-03-26 06:29:52,793 INFO transcription_utils.py:transcribe_segments:2175 TransformerProcessor job-cf6fbce0-8a0f-4c73-96de-10ccd6b3f943 TRANSKRIPTION-DEBUG: Starte parallele Transkription von 1 Segmenten/Kapiteln
2026-03-26 06:29:52,792 DEBUG audio_processor.py:get_audio_segments:580 TransformerProcessor job-cf6fbce0-8a0f-4c73-96de-10ccd6b3f943 Kapitel 1 als einzelnes Segment erstellt
{"duration_sec": 151.162, "segment_path": "cache/audio/temp/43b0fc33695d2252d85fc38ada6ec124/chapter_0/full.mp3"}
2026-03-26 06:29:51,991 DEBUG audio_processor.py:_create_cache_key:665 TransformerProcessor job-cf6fbce0-8a0f-4c73-96de-10ccd6b3f943 Cache-Schlüssel erstellt: Sprache 006.m4a|lang=en
2026-03-26 06:29:51,221 INFO audio_processor.py:__init__:225 TransformerProcessor job-cf6fbce0-8a0f-4c73-96de-10ccd6b3f943 Gesamte Initialisierungszeit: 405.48 ms
2026-03-26 06:29:51,153 DEBUG transformer_processor.py:__init__:199 TransformerProcessor job-cf6fbce0-8a0f-4c73-96de-10ccd6b3f943 Transformer Init Timings
{"total_init_ms": 188.56, "super_init_ms": 89.66, "config_load_ms": 17.31, "client_init_ms": 0.01, "transcriber_init_ms": 81.27}
2026-03-26 06:29:51,153 DEBUG transformer_processor.py:__init__:179 TransformerProcessor job-cf6fbce0-8a0f-4c73-96de-10ccd6b3f943 Transformer Processor initialisiert
{"model": "google/gemini-2.5-flash", "temperature": 0.1, "max_tokens": 4000, "target_format": "text", "max_concurrent_requests": 10, "timeout_seconds": 120, "templates_dir": "resources/templates"}
2026-03-26 06:29:50,816 DEBUG base_processor.py:init_logger:386 AudioProcessor job-cf6fbce0-8a0f-4c73-96de-10ccd6b3f943 Logger initialisiert
2026-03-25 19:41:12,851 DEBUG transformer_processor.py:__init__:199 TransformerProcessor a508c930-f7f3-4833-8957-4cf03cdc27db Transformer Init Timings
{"total_init_ms": 225.75, "super_init_ms": 107.11, "config_load_ms": 30.54, "client_init_ms": 0.02, "transcriber_init_ms": 87.76}
2026-03-25 19:41:12,851 DEBUG transformer_processor.py:__init__:179 TransformerProcessor a508c930-f7f3-4833-8957-4cf03cdc27db Transformer Processor initialisiert
{"model": "google/gemini-2.5-flash", "temperature": 0.1, "max_tokens": 4000, "target_format": "text", "max_concurrent_requests": 10, "timeout_seconds": 120, "templates_dir": "resources/templates"}
2026-03-25 19:41:12,626 DEBUG base_processor.py:init_logger:386 TransformerProcessor a508c930-f7f3-4833-8957-4cf03cdc27db Logger initialisiert
2026-03-25 19:30:39,043 DEBUG transformer_processor.py:__init__:199 TransformerProcessor 828ba2cd-d492-4f60-a65d-e4d3e2ee6340 Transformer Init Timings
{"total_init_ms": 227.73, "super_init_ms": 105.93, "config_load_ms": 17.51, "client_init_ms": 0.01, "transcriber_init_ms": 103.95}
2026-03-25 19:30:39,043 DEBUG transformer_processor.py:__init__:179 TransformerProcessor 828ba2cd-d492-4f60-a65d-e4d3e2ee6340 Transformer Processor initialisiert
{"model": "google/gemini-2.5-flash", "temperature": 0.1, "max_tokens": 4000, "target_format": "text", "max_concurrent_requests": 10, "timeout_seconds": 120, "templates_dir": "resources/templates"}
2026-03-25 19:30:38,816 DEBUG base_processor.py:init_logger:386 TransformerProcessor 828ba2cd-d492-4f60-a65d-e4d3e2ee6340 Logger initialisiert
2026-03-25 19:30:37,205 INFO rag_processor.py:embed_document_for_client:571 RAGProcessor f1a20818-cdc9-49eb-91bc-214b3753a588 Embedding (client) abgeschlossen
{"extra": {"document_id": "f68cdf91-cd56-475d-afea-36d0845d0522", "chunks": 1, "duration_seconds": 0.2736318111419678}}
2026-03-25 19:30:36,932 INFO rag_processor.py:embed_document_for_client:516 RAGProcessor f1a20818-cdc9-49eb-91bc-214b3753a588 Starte Embedding-Generierung (client)
{"extra": {"chunk_count": 1, "embedding_model": "voyage-3-large", "embedding_dimensions": 2048}}
2026-03-25 19:30:36,932 INFO rag_processor.py:embed_document_for_client:510 RAGProcessor f1a20818-cdc9-49eb-91bc-214b3753a588 Chunking abgeschlossen (client): 1 Chunks erstellt
2026-03-25 19:30:36,931 INFO rag_processor.py:embed_document_for_client:508 RAGProcessor f1a20818-cdc9-49eb-91bc-214b3753a588 Starte Chunking (client)
{"extra": {"document_id": "f68cdf91-cd56-475d-afea-36d0845d0522"}}
2026-03-25 19:30:36,931 INFO rag_processor.py:__init__:124 RAGProcessor f1a20818-cdc9-49eb-91bc-214b3753a588 RAGProcessor initialisiert
{"extra": {"embedding_model": "voyage-3-large", "embedding_dimensions": 2048, "chunk_size": 1000, "chunk_overlap": 200}}
2026-03-25 19:30:36,819 DEBUG base_processor.py:init_logger:386 RAGProcessor f1a20818-cdc9-49eb-91bc-214b3753a588 Logger initialisiert
2026-03-25 19:30:02,022 DEBUG transformer_processor.py:__init__:199 TransformerProcessor bf83dec7-91e5-4ed6-8270-22d3a0bdff13 Transformer Init Timings
{"total_init_ms": 209.88, "super_init_ms": 109.91, "config_load_ms": 19.06, "client_init_ms": 0.02, "transcriber_init_ms": 80.55}
2026-03-25 19:30:02,021 DEBUG transformer_processor.py:__init__:179 TransformerProcessor bf83dec7-91e5-4ed6-8270-22d3a0bdff13 Transformer Processor initialisiert
{"model": "google/gemini-2.5-flash", "temperature": 0.1, "max_tokens": 4000, "target_format": "text", "max_concurrent_requests": 10, "timeout_seconds": 120, "templates_dir": "resources/templates"}
2026-03-25 19:30:01,812 DEBUG base_processor.py:init_logger:386 TransformerProcessor bf83dec7-91e5-4ed6-8270-22d3a0bdff13 Logger initialisiert
2026-03-25 19:28:52,223 DEBUG transformer_processor.py:__init__:199 TransformerProcessor 7aafec98-f677-424d-8e43-1257271c8549 Transformer Init Timings
{"total_init_ms": 216.78, "super_init_ms": 120.65, "config_load_ms": 16.92, "client_init_ms": 0.01, "transcriber_init_ms": 78.85}
2026-03-25 19:28:52,223 DEBUG transformer_processor.py:__init__:179 TransformerProcessor 7aafec98-f677-424d-8e43-1257271c8549 Transformer Processor initialisiert
{"model": "google/gemini-2.5-flash", "temperature": 0.1, "max_tokens": 4000, "target_format": "text", "max_concurrent_requests": 10, "timeout_seconds": 120, "templates_dir": "resources/templates"}
2026-03-25 19:28:52,008 DEBUG base_processor.py:init_logger:386 TransformerProcessor 7aafec98-f677-424d-8e43-1257271c8549 Logger initialisiert
2026-03-25 19:28:43,722 INFO transcription_utils.py:transform_by_template:1461 TransformerProcessor 45a95ba4-03fe-4f55-a704-ddf8d674cb28 Template-Transformation abgeschlossen
{"duration_ms": 13599.88021850586, "model": "google/gemini-2.5-flash"}
2026-03-25 19:28:43,719 DEBUG transcription_utils.py:create_llm_request:455 TransformerProcessor 45a95ba4-03fe-4f55-a704-ddf8d674cb28 LLM Interaktion gespeichert
{"file": "cache/transformer/temp/debug/llm/20260325_192843_template_transform.json", "tokens": 3880}
2026-03-25 19:28:43,718 DEBUG transcription_utils.py:create_llm_request:416 TransformerProcessor 45a95ba4-03fe-4f55-a704-ddf8d674cb28 LLM Request erstellt
{"purpose": "template_transform", "tokens": 3880, "duration": 13599.88021850586, "model": "google/gemini-2.5-flash", "processor": "WhisperTranscriber"}
2026-03-25 19:28:43,718 DEBUG base_processor.py:add_llm_requests:510 TransformerProcessor 45a95ba4-03fe-4f55-a704-ddf8d674cb28 1 LLM-Requests hinzugefügt
2026-03-25 19:28:30,118 INFO transcription_utils.py:transform_by_template:1187 TransformerProcessor 45a95ba4-03fe-4f55-a704-ddf8d674cb28 Sende Anfrage an LLM Provider
2026-03-25 19:28:30,117 INFO transcription_utils.py:transform_by_template:1171 TransformerProcessor 45a95ba4-03fe-4f55-a704-ddf8d674cb28 Token-Prüfung vor Template-Transformation
{"model": "google/gemini-2.5-flash", "model_limit": 8192, "text_length": 7368, "estimated_text_tokens": 2870, "estimated_system_tokens": 230, "estimated_template_tokens": 318, "total_estimated_tokens": 3418}
2026-03-25 19:28:30,095 INFO transcription_utils.py:_extract_system_prompt:1762 TransformerProcessor 45a95ba4-03fe-4f55-a704-ddf8d674cb28 Kein Systemprompt im Template gefunden, verwende Standard-Prompt
2026-03-25 19:28:30,095 INFO transcription_utils.py:transform_by_template:1042 TransformerProcessor 45a95ba4-03fe-4f55-a704-ddf8d674cb28 Verwende direkt übergebenes Template-Inhalt (Länge: 607)
2026-03-25 19:28:30,095 INFO transcription_utils.py:transform_by_template:1027 TransformerProcessor 45a95ba4-03fe-4f55-a704-ddf8d674cb28 Starte Template-Transformation:
2026-03-25 19:28:30,095 DEBUG transformer_processor.py:__init__:199 TransformerProcessor 45a95ba4-03fe-4f55-a704-ddf8d674cb28 Transformer Init Timings
{"total_init_ms": 210.3, "super_init_ms": 107.03, "config_load_ms": 17.54, "client_init_ms": 0.01, "transcriber_init_ms": 85.42}
2026-03-25 19:28:30,095 DEBUG transformer_processor.py:__init__:179 TransformerProcessor 45a95ba4-03fe-4f55-a704-ddf8d674cb28 Transformer Processor initialisiert
{"model": "google/gemini-2.5-flash", "temperature": 0.1, "max_tokens": 4000, "target_format": "text", "max_concurrent_requests": 10, "timeout_seconds": 120, "templates_dir": "resources/templates"}
2026-03-25 19:28:29,885 DEBUG base_processor.py:init_logger:386 TransformerProcessor 45a95ba4-03fe-4f55-a704-ddf8d674cb28 Logger initialisiert
2026-03-25 17:08:55,421 DEBUG transformer_processor.py:__init__:199 TransformerProcessor 7dd7067d-ba5f-453c-9296-43ef94a8fb5c Transformer Init Timings
{"total_init_ms": 198.81, "super_init_ms": 100.75, "config_load_ms": 17.39, "client_init_ms": 0.01, "transcriber_init_ms": 80.29}
2026-03-25 17:08:55,421 DEBUG transformer_processor.py:__init__:179 TransformerProcessor 7dd7067d-ba5f-453c-9296-43ef94a8fb5c Transformer Processor initialisiert
{"model": "google/gemini-2.5-flash", "temperature": 0.1, "max_tokens": 4000, "target_format": "text", "max_concurrent_requests": 10, "timeout_seconds": 120, "templates_dir": "resources/templates"}
2026-03-25 17:08:55,223 DEBUG base_processor.py:init_logger:386 TransformerProcessor 7dd7067d-ba5f-453c-9296-43ef94a8fb5c Logger initialisiert
2026-03-25 17:08:54,339 INFO rag_processor.py:embed_document_for_client:571 RAGProcessor 043d48cf-acf4-4949-8ca3-2ae00989fddb Embedding (client) abgeschlossen
{"extra": {"document_id": "1f755d82-8f06-4247-979a-c483dce08c8c", "chunks": 1, "duration_seconds": 0.2851400375366211}}
2026-03-25 17:08:54,054 INFO rag_processor.py:embed_document_for_client:516 RAGProcessor 043d48cf-acf4-4949-8ca3-2ae00989fddb Starte Embedding-Generierung (client)
{"extra": {"chunk_count": 1, "embedding_model": "voyage-3-large", "embedding_dimensions": 2048}}
2026-03-25 17:08:54,054 INFO rag_processor.py:embed_document_for_client:510 RAGProcessor 043d48cf-acf4-4949-8ca3-2ae00989fddb Chunking abgeschlossen (client): 1 Chunks erstellt
2026-03-25 17:08:54,053 INFO rag_processor.py:embed_document_for_client:508 RAGProcessor 043d48cf-acf4-4949-8ca3-2ae00989fddb Starte Chunking (client)
{"extra": {"document_id": "1f755d82-8f06-4247-979a-c483dce08c8c"}}
2026-03-25 17:08:54,053 INFO rag_processor.py:__init__:124 RAGProcessor 043d48cf-acf4-4949-8ca3-2ae00989fddb RAGProcessor initialisiert
{"extra": {"embedding_model": "voyage-3-large", "embedding_dimensions": 2048, "chunk_size": 1000, "chunk_overlap": 200}}
2026-03-25 17:08:53,960 DEBUG base_processor.py:init_logger:386 RAGProcessor 043d48cf-acf4-4949-8ca3-2ae00989fddb Logger initialisiert
2026-03-25 17:03:27,276 DEBUG transformer_processor.py:__init__:199 TransformerProcessor b9102be3-b55d-4fd9-b92f-17db24b99440 Transformer Init Timings
{"total_init_ms": 203.94, "super_init_ms": 86.57, "config_load_ms": 17.14, "client_init_ms": 21.63, "transcriber_init_ms": 78.18}
2026-03-25 17:03:27,275 DEBUG transformer_processor.py:__init__:179 TransformerProcessor b9102be3-b55d-4fd9-b92f-17db24b99440 Transformer Processor initialisiert
{"model": "google/gemini-2.5-flash", "temperature": 0.1, "max_tokens": 4000, "target_format": "text", "max_concurrent_requests": 10, "timeout_seconds": 120, "templates_dir": "resources/templates"}
2026-03-25 17:03:27,072 DEBUG base_processor.py:init_logger:386 TransformerProcessor b9102be3-b55d-4fd9-b92f-17db24b99440 Logger initialisiert
2026-03-25 14:52:48,767 INFO pdf_processor.py:process_mistral_ocr_with_pages:967 TransformerProcessor job-cc59fb13-099f-43ee-bfe2-e75deef0f1ac Mistral-OCR mit Seiten: Verarbeitung abgeschlossen
{"progress": 90}
2026-03-25 14:52:48,764 INFO pdf_processor.py:process_mistral_ocr_with_pages:909 TransformerProcessor job-cc59fb13-099f-43ee-bfe2-e75deef0f1ac Mistral OCR Bilder direkt in ZIP gepackt: 1 Bilder in mistral_ocr_images_job-cc59fb13-099f-43ee-bfe2-e75deef0f1ac.zip
2026-03-25 14:52:48,761 INFO pdf_processor.py:_extract_pdf_pages_as_images:657 TransformerProcessor job-cc59fb13-099f-43ee-bfe2-e75deef0f1ac PDF-Seiten als Bilder extrahiert: 8 Seiten, ZIP: cache/pdf/temp/pdf/145daf57314f602f/pages.zip
2026-03-25 14:52:48,639 INFO pdf_processor.py:_process_mistral_ocr:590 TransformerProcessor job-cc59fb13-099f-43ee-bfe2-e75deef0f1ac Mistral-OCR: Ergebnis geparst (8 Seiten)
{"progress": 85}
2026-03-25 14:52:48,638 DEBUG pdf_processor.py:_process_mistral_ocr:564 TransformerProcessor job-cc59fb13-099f-43ee-bfe2-e75deef0f1ac Mistral-OCR: OCR Response Struktur
{"response_keys": ["pages", "model", "document_annotation", "usage_info"], "pages_count": 8, "has_images": false}
2026-03-25 14:52:48,638 INFO pdf_processor.py:_process_mistral_ocr:554 TransformerProcessor job-cc59fb13-099f-43ee-bfe2-e75deef0f1ac Mistral-OCR: OCR-Antwort empfangen
{"progress": 75, "status_code": 200, "response_size": 31604}
2026-03-25 14:52:48,637 DEBUG pdf_processor.py:_process_mistral_ocr:526 TransformerProcessor job-cc59fb13-099f-43ee-bfe2-e75deef0f1ac Mistral-OCR: OCR Response empfangen
{"status_code": 200, "headers": {"Date": "Wed, 25 Mar 2026 14:52:48 GMT", "Content-Type": "application/json", "Transfer-Encoding": "chunked", "Connection": "keep-alive", "mistral-correlation-id": "019d257b-b41f-7853-816a-f67d7f5824f4", "x-kong-request-id": "019d257b-b41f-7853-816a-f67d7f5824f4", "x-envoy-upstream-service-time": "11318", "Server": "cloudflare", "access-control-allow-origin": "*", "x-kong-upstream-latency": "11320", "x-kong-proxy-latency": "19", "set-cookie": "__cf_bm=Jecq.nwxb7C1btO4z7kq5R52iWbczUiq3QVWDJ0EML0-1774450357.260977-1.0.1.1-jkKWvQ768jWeRmJZTWpToO2fMhVRxJU3kznnevdiehbnNwudaSZt2Zlfau99y6M6wzquxeWzZQ3phdEBSAUGA5RCRJ5ehBUVdbwZM59YVZDDH97.xztQ0GPI1BcuCho0; HttpOnly; Secure; Path=/; Domain=mistral.ai; Expires=Wed, 25 Mar 2026 15:22:48 GMT, _cfuvid=4goOP3WP_i0SFhLAsgu1jz0wk4GHtq4RKUG8gp_9Yvk-1774450357.260977-1.0.1.1-4iU9UKegtol5d4VM0ChAxps_snYgkHb.KoxK8Q2Bd2M; HttpOnly; SameSite=None; Secure; Path=/; Domain=mistral.ai", "X-Content-Type-Options": "nosniff", "cf-cache-status": "DYNAMIC", "Strict-Transport-Security": "max-age=15552000; includeSubDomains; preload", "Content-Encoding": "br", "CF-RAY": "9e1ebd8cddf13a80-FRA", "alt-svc": "h3=\":443\"; ma=86400"}, "response_size": 31604, "content_type": "application/json"}
2026-03-25 14:52:37,192 DEBUG pdf_processor.py:_process_mistral_ocr:522 TransformerProcessor job-cc59fb13-099f-43ee-bfe2-e75deef0f1ac Mistral-OCR: OCR Payload
{"payload": {"model": "mistral-ocr-latest", "document": {"type": "file", "file_id": "ec321ca2-4d5f-4860-bf12-72064a6459c7"}, "include_image_base64": true}}
2026-03-25 14:52:37,191 INFO pdf_processor.py:_process_mistral_ocr:513 TransformerProcessor job-cc59fb13-099f-43ee-bfe2-e75deef0f1ac Mistral-OCR: OCR-Anfrage wird gesendet
{"progress": 60, "ocr_url": "https://api.mistral.ai/v1/ocr", "model": "mistral-ocr-latest", "file_id": "ec321ca2-4d5f-4860-bf12-72064a6459c7", "pages_count": null, "include_image_base64": true}
2026-03-25 14:52:37,191 INFO pdf_processor.py:_process_mistral_ocr:483 TransformerProcessor job-cc59fb13-099f-43ee-bfe2-e75deef0f1ac Mistral-OCR: Upload abgeschlossen
{"progress": 30, "file_id": "ec321ca2-4d5f-4860-bf12-72064a6459c7", "upload_response_keys": ["id", "object", "bytes", "created_at", "filename", "purpose", "sample_type", "num_lines", "mimetype", "source", "signature", "expires_at", "visibility"]}
2026-03-25 14:52:37,190 DEBUG pdf_processor.py:_process_mistral_ocr:469 TransformerProcessor job-cc59fb13-099f-43ee-bfe2-e75deef0f1ac Mistral-OCR: Upload Response
{"status_code": 200, "headers": {"Date": "Wed, 25 Mar 2026 14:52:37 GMT", "Content-Type": "application/json; charset=utf-8", "Transfer-Encoding": "chunked", "Connection": "keep-alive", "mistral-correlation-id": "019d257b-b18d-7711-afb2-41df063c9386", "x-kong-request-id": "019d257b-b18d-7711-afb2-41df063c9386", "Server": "cloudflare", "x-envoy-upstream-service-time": "451", "access-control-allow-origin": "*", "x-kong-upstream-latency": "541", "x-kong-proxy-latency": "7", "cf-cache-status": "DYNAMIC", "set-cookie": "__cf_bm=2V4T_wv0Ej6eo_CAlBGyUGY0xQO4QSqJckrllwnb_Ag-1774450356.5863068-1.0.1.1-oeet5NRZPOaYtfx36NhY68FRXNJo5XgoqPnWXPS8pssAfFlU7yuiG4IXWnzcU9SJuwHjp3rXEkMcTKkYF6aNkPQn8_aZskTs5cDSoYNzjH49FBR0fVmD8ToVoFHn144E; HttpOnly; Secure; Path=/; Domain=mistral.ai; Expires=Wed, 25 Mar 2026 15:22:37 GMT, _cfuvid=Ie5dZ5W3zS0nidP2a5622KSk9AVBh6qpypbnpKsDo_Q-1774450356.5863068-1.0.1.1-KAvaW7189iHAeh9Kx8SzEu5m_9W07XCMtjPt9BWXqGk; HttpOnly; SameSite=None; Secure; Path=/; Domain=mistral.ai", "Strict-Transport-Security": "max-age=15552000; includeSubDomains; preload", "X-Content-Type-Options": "nosniff", "Content-Encoding": "br", "CF-RAY": "9e1ebd88a857dc59-FRA", "alt-svc": "h3=\":443\"; ma=86400"}, "response_size": 379}
2026-03-25 14:52:36,543 INFO pdf_processor.py:_process_mistral_ocr:456 TransformerProcessor job-cc59fb13-099f-43ee-bfe2-e75deef0f1ac Mistral-OCR: Upload startet
{"progress": 10, "file_name": "upload_8c5d7741-1ba3-46c4-89cd-399b37999800.pdf", "file_size": 942877, "upload_url": "https://api.mistral.ai/v1/files"}
2026-03-25 14:52:36,543 INFO pdf_processor.py:_process_mistral_ocr:442 TransformerProcessor job-cc59fb13-099f-43ee-bfe2-e75deef0f1ac Mistral-OCR: Verarbeitung startet
{"progress": 5, "file_name": "upload_8c5d7741-1ba3-46c4-89cd-399b37999800.pdf", "file_size": 942877, "page_start": null, "page_end": null, "include_ocr_images": true}
2026-03-25 14:52:36,476 DEBUG transformer_processor.py:__init__:199 TransformerProcessor job-cc59fb13-099f-43ee-bfe2-e75deef0f1ac Transformer Init Timings
{"total_init_ms": 209.67, "super_init_ms": 89.28, "config_load_ms": 18.89, "client_init_ms": 21.76, "transcriber_init_ms": 79.43}
2026-03-25 14:52:36,476 DEBUG transformer_processor.py:__init__:179 TransformerProcessor job-cc59fb13-099f-43ee-bfe2-e75deef0f1ac Transformer Processor initialisiert
{"model": "google/gemini-2.5-flash", "temperature": 0.1, "max_tokens": 4000, "target_format": "text", "max_concurrent_requests": 10, "timeout_seconds": 120, "templates_dir": "resources/templates"}
2026-03-25 14:52:36,266 DEBUG imageocr_processor.py:__init__:224 ImageOCRProcessor job-cc59fb13-099f-43ee-bfe2-e75deef0f1ac ImageOCRProcessor initialisiert mit Konfiguration
{"max_file_size": 10485760, "max_resolution": 4096, "temp_dir": "cache/imageocr/temp", "cache_dir": "cache/imageocr"}
2026-03-25 14:52:36,164 DEBUG transformer_processor.py:__init__:199 TransformerProcessor job-cc59fb13-099f-43ee-bfe2-e75deef0f1ac Transformer Init Timings
{"total_init_ms": 182.83, "super_init_ms": 85.55, "config_load_ms": 17.63, "client_init_ms": 0.01, "transcriber_init_ms": 79.35}
2026-03-25 14:52:36,164 DEBUG transformer_processor.py:__init__:179 TransformerProcessor job-cc59fb13-099f-43ee-bfe2-e75deef0f1ac Transformer Processor initialisiert
{"model": "google/gemini-2.5-flash", "temperature": 0.1, "max_tokens": 4000, "target_format": "text", "max_concurrent_requests": 10, "timeout_seconds": 120, "templates_dir": "resources/templates"}
2026-03-25 14:52:35,981 DEBUG pdf_processor.py:__init__:236 PDFProcessor job-cc59fb13-099f-43ee-bfe2-e75deef0f1ac PDFProcessor initialisiert mit Konfiguration
{"max_file_size": 150000000, "max_pages": 500, "temp_dir": "cache/pdf/temp", "cache_dir": "cache/pdf", "main_image_max_size": 1280, "main_image_format": "jpg", "preview_image_max_size": 360, "preview_image_format": "jpg"}
2026-03-25 14:52:35,727 DEBUG base_processor.py:init_logger:386 PDFProcessor job-cc59fb13-099f-43ee-bfe2-e75deef0f1ac Logger initialisiert
2026-03-25 14:22:33,956 INFO transcription_utils.py:transform_by_template:1461 TransformerProcessor a4552c2a-269f-4c11-9fe6-1b776b3446a6 Template-Transformation abgeschlossen
{"duration_ms": 6541.769027709961, "model": "google/gemini-2.5-flash"}
2026-03-25 14:22:33,955 DEBUG transcription_utils.py:create_llm_request:455 TransformerProcessor a4552c2a-269f-4c11-9fe6-1b776b3446a6 LLM Interaktion gespeichert
{"file": "cache/transformer/temp/debug/llm/20260325_142233_template_transform.json", "tokens": 16744}
2026-03-25 14:22:33,954 DEBUG transcription_utils.py:create_llm_request:416 TransformerProcessor a4552c2a-269f-4c11-9fe6-1b776b3446a6 LLM Request erstellt
{"purpose": "template_transform", "tokens": 16744, "duration": 6541.769027709961, "model": "google/gemini-2.5-flash", "processor": "WhisperTranscriber"}
2026-03-25 14:22:33,953 DEBUG base_processor.py:add_llm_requests:510 TransformerProcessor a4552c2a-269f-4c11-9fe6-1b776b3446a6 1 LLM-Requests hinzugefügt
2026-03-25 14:22:27,411 INFO transcription_utils.py:transform_by_template:1187 TransformerProcessor a4552c2a-269f-4c11-9fe6-1b776b3446a6 Sende Anfrage an LLM Provider
2026-03-25 14:22:27,411 INFO transcription_utils.py:transform_by_template:1171 TransformerProcessor a4552c2a-269f-4c11-9fe6-1b776b3446a6 Token-Prüfung vor Template-Transformation
{"model": "google/gemini-2.5-flash", "model_limit": 8192, "text_length": 22628, "estimated_text_tokens": 13040, "estimated_system_tokens": 3080, "estimated_template_tokens": 1196, "total_estimated_tokens": 17316}
2026-03-25 14:22:27,389 INFO transcription_utils.py:_extract_system_prompt:1753 TransformerProcessor a4552c2a-269f-4c11-9fe6-1b776b3446a6 Systemprompt aus Template extrahiert
{"prompt_length": 8462}
2026-03-25 14:22:27,389 INFO transcription_utils.py:transform_by_template:1042 TransformerProcessor a4552c2a-269f-4c11-9fe6-1b776b3446a6 Verwende direkt übergebenes Template-Inhalt (Länge: 12015)
2026-03-25 14:22:27,388 INFO transcription_utils.py:transform_by_template:1027 TransformerProcessor a4552c2a-269f-4c11-9fe6-1b776b3446a6 Starte Template-Transformation:
2026-03-25 14:22:27,388 DEBUG transformer_processor.py:__init__:199 TransformerProcessor a4552c2a-269f-4c11-9fe6-1b776b3446a6 Transformer Init Timings
{"total_init_ms": 231.6, "super_init_ms": 112.65, "config_load_ms": 16.91, "client_init_ms": 21.61, "transcriber_init_ms": 80.12}
2026-03-25 14:22:27,388 DEBUG transformer_processor.py:__init__:179 TransformerProcessor a4552c2a-269f-4c11-9fe6-1b776b3446a6 Transformer Processor initialisiert
{"model": "google/gemini-2.5-flash", "temperature": 0.1, "max_tokens": 4000, "target_format": "text", "max_concurrent_requests": 10, "timeout_seconds": 120, "templates_dir": "resources/templates"}
2026-03-25 14:22:27,157 DEBUG base_processor.py:init_logger:386 TransformerProcessor a4552c2a-269f-4c11-9fe6-1b776b3446a6 Logger initialisiert
2026-03-25 14:22:06,150 INFO pdf_processor.py:process_mistral_ocr_with_pages:967 TransformerProcessor job-22948fa3-8d6c-48a4-869b-44df14d9434b Mistral-OCR mit Seiten: Verarbeitung abgeschlossen
{"progress": 90}
2026-03-25 14:22:06,148 INFO pdf_processor.py:process_mistral_ocr_with_pages:909 TransformerProcessor job-22948fa3-8d6c-48a4-869b-44df14d9434b Mistral OCR Bilder direkt in ZIP gepackt: 1 Bilder in mistral_ocr_images_job-22948fa3-8d6c-48a4-869b-44df14d9434b.zip
2026-03-25 14:22:06,147 INFO pdf_processor.py:_extract_pdf_pages_as_images:657 TransformerProcessor job-22948fa3-8d6c-48a4-869b-44df14d9434b PDF-Seiten als Bilder extrahiert: 8 Seiten, ZIP: cache/pdf/temp/pdf/11f39c2f8c88f254/pages.zip
2026-03-25 14:22:06,014 INFO pdf_processor.py:_process_mistral_ocr:590 TransformerProcessor job-22948fa3-8d6c-48a4-869b-44df14d9434b Mistral-OCR: Ergebnis geparst (8 Seiten)
{"progress": 85}
2026-03-25 14:22:06,014 DEBUG pdf_processor.py:_process_mistral_ocr:564 TransformerProcessor job-22948fa3-8d6c-48a4-869b-44df14d9434b Mistral-OCR: OCR Response Struktur
{"response_keys": ["pages", "model", "document_annotation", "usage_info"], "pages_count": 8, "has_images": false}
2026-03-25 14:22:06,013 INFO pdf_processor.py:_process_mistral_ocr:554 TransformerProcessor job-22948fa3-8d6c-48a4-869b-44df14d9434b Mistral-OCR: OCR-Antwort empfangen
{"progress": 75, "status_code": 200, "response_size": 31604}
2026-03-25 14:22:06,013 DEBUG pdf_processor.py:_process_mistral_ocr:526 TransformerProcessor job-22948fa3-8d6c-48a4-869b-44df14d9434b Mistral-OCR: OCR Response empfangen
{"status_code": 200, "headers": {"Date": "Wed, 25 Mar 2026 14:22:06 GMT", "Content-Type": "application/json", "Transfer-Encoding": "chunked", "Connection": "keep-alive", "mistral-correlation-id": "019d255f-7971-7b66-ae2d-fb2aec9e5b4e", "x-kong-request-id": "019d255f-7971-7b66-ae2d-fb2aec9e5b4e", "x-envoy-upstream-service-time": "18724", "Server": "cloudflare", "access-control-allow-origin": "*", "x-kong-upstream-latency": "18726", "x-kong-proxy-latency": "11", "set-cookie": "__cf_bm=7hbnaiGvuh0JMR64AYbJWmlk84cvHbld5WxboCucomI-1774448507.2295494-1.0.1.1-zrVn9.YXqYYVPegUy4klvwezU8ktMDo8pDnmVq8Jb7meM4SjHC4sDJ0M3Ezb7HbyvKz7NOV276rvVCayXymj8niJaSRQBOlql1Gy1Vc3lqsjAaDZH7HkjIVvNIKmUqIo; HttpOnly; Secure; Path=/; Domain=mistral.ai; Expires=Wed, 25 Mar 2026 14:52:06 GMT, _cfuvid=GGVZKOI0KcCO4Jo3uhzqi4d.IClE.lNq8GJBt88enO0-1774448507.2295494-1.0.1.1-wkTciwsgNkoioF9LqGT8kTmTCbWIt21H7q7b.Eex7cA; HttpOnly; SameSite=None; Secure; Path=/; Domain=mistral.ai", "X-Content-Type-Options": "nosniff", "cf-cache-status": "DYNAMIC", "Strict-Transport-Security": "max-age=15552000; includeSubDomains; preload", "Content-Encoding": "br", "CF-RAY": "9e1e906228919189-FRA", "alt-svc": "h3=\":443\"; ma=86400"}, "response_size": 31604, "content_type": "application/json"}
2026-03-25 14:21:47,180 DEBUG pdf_processor.py:_process_mistral_ocr:522 TransformerProcessor job-22948fa3-8d6c-48a4-869b-44df14d9434b Mistral-OCR: OCR Payload
{"payload": {"model": "mistral-ocr-latest", "document": {"type": "file", "file_id": "ec321ca2-4d5f-4860-bf12-72064a6459c7"}, "include_image_base64": true}}
2026-03-25 14:21:47,180 INFO pdf_processor.py:_process_mistral_ocr:513 TransformerProcessor job-22948fa3-8d6c-48a4-869b-44df14d9434b Mistral-OCR: OCR-Anfrage wird gesendet
{"progress": 60, "ocr_url": "https://api.mistral.ai/v1/ocr", "model": "mistral-ocr-latest", "file_id": "ec321ca2-4d5f-4860-bf12-72064a6459c7", "pages_count": null, "include_image_base64": true}
2026-03-25 14:21:47,180 INFO pdf_processor.py:_process_mistral_ocr:483 TransformerProcessor job-22948fa3-8d6c-48a4-869b-44df14d9434b Mistral-OCR: Upload abgeschlossen
{"progress": 30, "file_id": "ec321ca2-4d5f-4860-bf12-72064a6459c7", "upload_response_keys": ["id", "object", "bytes", "created_at", "filename", "purpose", "sample_type", "num_lines", "mimetype", "source", "signature", "expires_at", "visibility"]}
2026-03-25 14:21:47,180 DEBUG pdf_processor.py:_process_mistral_ocr:469 TransformerProcessor job-22948fa3-8d6c-48a4-869b-44df14d9434b Mistral-OCR: Upload Response
{"status_code": 200, "headers": {"Date": "Wed, 25 Mar 2026 14:21:47 GMT", "Content-Type": "application/json; charset=utf-8", "Transfer-Encoding": "chunked", "Connection": "keep-alive", "mistral-correlation-id": "019d255f-754a-7180-bbff-fb1ea66315ad", "x-kong-request-id": "019d255f-754a-7180-bbff-fb1ea66315ad", "Server": "cloudflare", "x-envoy-upstream-service-time": "903", "access-control-allow-origin": "*", "x-kong-upstream-latency": "961", "x-kong-proxy-latency": "9", "cf-cache-status": "DYNAMIC", "set-cookie": "__cf_bm=RxWUdVFhAEVxT5ud.tX6G.pvji1hxQatPp_N9I7xjtw-1774448506.152972-1.0.1.1-0ne6ZonmKIy8TH8y4MwLT20geda7JWgCuoYLWsrfNHB.ZIVkMhg06VjZL.Yv4.zkBJ4wyiXTwONuYMKIncT1M03Sf7_faJU2zoBhHR.ca7yWBuiO9hpjq_vKlUNU7SH0; HttpOnly; Secure; Path=/; Domain=mistral.ai; Expires=Wed, 25 Mar 2026 14:51:47 GMT, _cfuvid=gudYvZanT5s9wZJwkvBfXVO262uufD7SniF156MLbxI-1774448506.152972-1.0.1.1-jtHpQDwgKxmtNQKf9Zvw9zWJbfdy7BOVcg8KInZMppg; HttpOnly; SameSite=None; Secure; Path=/; Domain=mistral.ai", "Strict-Transport-Security": "max-age=15552000; includeSubDomains; preload", "X-Content-Type-Options": "nosniff", "Content-Encoding": "br", "CF-RAY": "9e1e905b7a94dca0-FRA", "alt-svc": "h3=\":443\"; ma=86400"}, "response_size": 379}
2026-03-25 14:21:46,112 INFO pdf_processor.py:_process_mistral_ocr:456 TransformerProcessor job-22948fa3-8d6c-48a4-869b-44df14d9434b Mistral-OCR: Upload startet
{"progress": 10, "file_name": "upload_0283478e-a27a-4ca8-bc3a-6f3e1e6664ee.pdf", "file_size": 942877, "upload_url": "https://api.mistral.ai/v1/files"}
2026-03-25 14:21:46,112 INFO pdf_processor.py:_process_mistral_ocr:442 TransformerProcessor job-22948fa3-8d6c-48a4-869b-44df14d9434b Mistral-OCR: Verarbeitung startet
{"progress": 5, "file_name": "upload_0283478e-a27a-4ca8-bc3a-6f3e1e6664ee.pdf", "file_size": 942877, "page_start": null, "page_end": null, "include_ocr_images": true}
2026-03-25 14:21:46,042 DEBUG transformer_processor.py:__init__:199 TransformerProcessor job-22948fa3-8d6c-48a4-869b-44df14d9434b Transformer Init Timings
{"total_init_ms": 402.92, "super_init_ms": 281.58, "config_load_ms": 16.91, "client_init_ms": 22.85, "transcriber_init_ms": 81.25}
2026-03-25 14:21:46,042 DEBUG transformer_processor.py:__init__:179 TransformerProcessor job-22948fa3-8d6c-48a4-869b-44df14d9434b Transformer Processor initialisiert
{"model": "google/gemini-2.5-flash", "temperature": 0.1, "max_tokens": 4000, "target_format": "text", "max_concurrent_requests": 10, "timeout_seconds": 120, "templates_dir": "resources/templates"}
2026-03-25 14:21:45,639 DEBUG imageocr_processor.py:__init__:224 ImageOCRProcessor job-22948fa3-8d6c-48a4-869b-44df14d9434b ImageOCRProcessor initialisiert mit Konfiguration
{"max_file_size": 10485760, "max_resolution": 4096, "temp_dir": "cache/imageocr/temp", "cache_dir": "cache/imageocr"}
2026-03-25 14:21:45,531 DEBUG transformer_processor.py:__init__:199 TransformerProcessor job-22948fa3-8d6c-48a4-869b-44df14d9434b Transformer Init Timings
{"total_init_ms": 186.66, "super_init_ms": 87.59, "config_load_ms": 17.72, "client_init_ms": 0.01, "transcriber_init_ms": 80.98}
2026-03-25 14:21:45,530 DEBUG transformer_processor.py:__init__:179 TransformerProcessor job-22948fa3-8d6c-48a4-869b-44df14d9434b Transformer Processor initialisiert
{"model": "google/gemini-2.5-flash", "temperature": 0.1, "max_tokens": 4000, "target_format": "text", "max_concurrent_requests": 10, "timeout_seconds": 120, "templates_dir": "resources/templates"}
2026-03-25 14:21:45,344 DEBUG pdf_processor.py:__init__:236 PDFProcessor job-22948fa3-8d6c-48a4-869b-44df14d9434b PDFProcessor initialisiert mit Konfiguration
{"max_file_size": 150000000, "max_pages": 500, "temp_dir": "cache/pdf/temp", "cache_dir": "cache/pdf", "main_image_max_size": 1280, "main_image_format": "jpg", "preview_image_max_size": 360, "preview_image_format": "jpg"}
2026-03-25 14:21:45,096 DEBUG base_processor.py:init_logger:386 PDFProcessor job-22948fa3-8d6c-48a4-869b-44df14d9434b Logger initialisiert
2026-03-25 14:15:31,121 INFO transcription_utils.py:transform_by_template:1461 TransformerProcessor 841f3186-1928-4790-9621-c69aff45fb47 Template-Transformation abgeschlossen
{"duration_ms": 4725.185871124268, "model": "google/gemini-2.5-flash"}
2026-03-25 14:15:31,119 DEBUG transcription_utils.py:create_llm_request:455 TransformerProcessor 841f3186-1928-4790-9621-c69aff45fb47 LLM Interaktion gespeichert
{"file": "cache/transformer/temp/debug/llm/20260325_141531_template_transform.json", "tokens": 5971}
2026-03-25 14:15:31,118 DEBUG transcription_utils.py:create_llm_request:416 TransformerProcessor 841f3186-1928-4790-9621-c69aff45fb47 LLM Request erstellt
{"purpose": "template_transform", "tokens": 5971, "duration": 4725.185871124268, "model": "google/gemini-2.5-flash", "processor": "WhisperTranscriber"}
2026-03-25 14:15:31,118 DEBUG base_processor.py:add_llm_requests:510 TransformerProcessor 841f3186-1928-4790-9621-c69aff45fb47 1 LLM-Requests hinzugefügt
2026-03-25 14:15:26,392 INFO transcription_utils.py:transform_by_template:1187 TransformerProcessor 841f3186-1928-4790-9621-c69aff45fb47 Sende Anfrage an LLM Provider
2026-03-25 14:15:26,392 INFO transcription_utils.py:transform_by_template:1171 TransformerProcessor 841f3186-1928-4790-9621-c69aff45fb47 Token-Prüfung vor Template-Transformation
{"model": "google/gemini-2.5-flash", "model_limit": 8192, "text_length": 2622, "estimated_text_tokens": 1338, "estimated_system_tokens": 3080, "estimated_template_tokens": 1200, "total_estimated_tokens": 5618}
2026-03-25 14:15:26,370 INFO transcription_utils.py:_extract_system_prompt:1753 TransformerProcessor 841f3186-1928-4790-9621-c69aff45fb47 Systemprompt aus Template extrahiert
{"prompt_length": 8462}
2026-03-25 14:15:26,369 INFO transcription_utils.py:transform_by_template:1042 TransformerProcessor 841f3186-1928-4790-9621-c69aff45fb47 Verwende direkt übergebenes Template-Inhalt (Länge: 12015)
2026-03-25 14:15:26,369 INFO transcription_utils.py:transform_by_template:1027 TransformerProcessor 841f3186-1928-4790-9621-c69aff45fb47 Starte Template-Transformation:
2026-03-25 14:15:26,369 DEBUG transformer_processor.py:__init__:199 TransformerProcessor 841f3186-1928-4790-9621-c69aff45fb47 Transformer Init Timings
{"total_init_ms": 210.64, "super_init_ms": 92.78, "config_load_ms": 17.01, "client_init_ms": 21.73, "transcriber_init_ms": 78.82}
2026-03-25 14:15:26,369 DEBUG transformer_processor.py:__init__:179 TransformerProcessor 841f3186-1928-4790-9621-c69aff45fb47 Transformer Processor initialisiert
{"model": "google/gemini-2.5-flash", "temperature": 0.1, "max_tokens": 4000, "target_format": "text", "max_concurrent_requests": 10, "timeout_seconds": 120, "templates_dir": "resources/templates"}
2026-03-25 14:15:26,159 DEBUG base_processor.py:init_logger:386 TransformerProcessor 841f3186-1928-4790-9621-c69aff45fb47 Logger initialisiert
2026-03-25 14:13:36,807 INFO pdf_processor.py:process_mistral_ocr_with_pages:967 TransformerProcessor job-2da427dc-fb99-4d7e-96e9-6a90abbdfc7c Mistral-OCR mit Seiten: Verarbeitung abgeschlossen
{"progress": 90}
2026-03-25 14:13:36,807 INFO pdf_processor.py:process_mistral_ocr_with_pages:909 TransformerProcessor job-2da427dc-fb99-4d7e-96e9-6a90abbdfc7c Mistral OCR Bilder direkt in ZIP gepackt: 2 Bilder in mistral_ocr_images_job-2da427dc-fb99-4d7e-96e9-6a90abbdfc7c.zip
2026-03-25 14:13:36,804 INFO pdf_processor.py:_extract_pdf_pages_as_images:657 TransformerProcessor job-2da427dc-fb99-4d7e-96e9-6a90abbdfc7c PDF-Seiten als Bilder extrahiert: 1 Seiten, ZIP: cache/pdf/temp/pdf/b86729392efd067d/pages.zip
2026-03-25 14:13:36,696 INFO pdf_processor.py:_process_mistral_ocr:590 TransformerProcessor job-2da427dc-fb99-4d7e-96e9-6a90abbdfc7c Mistral-OCR: Ergebnis geparst (1 Seiten)
{"progress": 85}
2026-03-25 14:13:36,696 DEBUG pdf_processor.py:_process_mistral_ocr:564 TransformerProcessor job-2da427dc-fb99-4d7e-96e9-6a90abbdfc7c Mistral-OCR: OCR Response Struktur
{"response_keys": ["pages", "model", "document_annotation", "usage_info"], "pages_count": 1, "has_images": false}
2026-03-25 14:13:36,695 INFO pdf_processor.py:_process_mistral_ocr:554 TransformerProcessor job-2da427dc-fb99-4d7e-96e9-6a90abbdfc7c Mistral-OCR: OCR-Antwort empfangen
{"progress": 75, "status_code": 200, "response_size": 88837}
2026-03-25 14:13:36,694 DEBUG pdf_processor.py:_process_mistral_ocr:526 TransformerProcessor job-2da427dc-fb99-4d7e-96e9-6a90abbdfc7c Mistral-OCR: OCR Response empfangen
{"status_code": 200, "headers": {"Date": "Wed, 25 Mar 2026 14:13:36 GMT", "Content-Type": "application/json", "Transfer-Encoding": "chunked", "Connection": "keep-alive", "mistral-correlation-id": "019d2557-e377-713d-89d6-ffbfe911fe86", "x-kong-request-id": "019d2557-e377-713d-89d6-ffbfe911fe86", "x-envoy-upstream-service-time": "6551", "Server": "cloudflare", "access-control-allow-origin": "*", "x-kong-upstream-latency": "6551", "x-kong-proxy-latency": "11", "set-cookie": "__cf_bm=6qG8FI9uNDHwDMID85MD0Tc3OPNNqzw6yTY8eLZwmiw-1774448010.0835445-1.0.1.1-fp8jPU_Oe7Wk7s7v0bo3ArGscVHn3zenquSheOXKXQJnV1RFTR6goWdVUNCNKM__LgBcONlh0oHOsQaGSA.wP1TV9hNNGUADIOKdrGU_BmWvEXnSw8opJpfkxVO1jv03; HttpOnly; Secure; Path=/; Domain=mistral.ai; Expires=Wed, 25 Mar 2026 14:43:36 GMT, _cfuvid=dSrkrfR3ij8h2xU7jqj7yviWEFlHfkavxdUBQtcT4lk-1774448010.0835445-1.0.1.1-w8gbwLQjnrRhUbIkQlQm1G2pCx05g0H5iLQh_hi_ZEg; HttpOnly; SameSite=None; Secure; Path=/; Domain=mistral.ai", "X-Content-Type-Options": "nosniff", "cf-cache-status": "DYNAMIC", "Strict-Transport-Security": "max-age=15552000; includeSubDomains; preload", "Content-Encoding": "br", "CF-RAY": "9e1e843f0da0d3a8-FRA", "alt-svc": "h3=\":443\"; ma=86400"}, "response_size": 88837, "content_type": "application/json"}
2026-03-25 14:13:30,033 DEBUG pdf_processor.py:_process_mistral_ocr:522 TransformerProcessor job-2da427dc-fb99-4d7e-96e9-6a90abbdfc7c Mistral-OCR: OCR Payload
{"payload": {"model": "mistral-ocr-latest", "document": {"type": "file", "file_id": "09f3100c-816d-4b61-9e2e-8ee254860969"}, "include_image_base64": true}}
2026-03-25 14:13:30,033 INFO pdf_processor.py:_process_mistral_ocr:513 TransformerProcessor job-2da427dc-fb99-4d7e-96e9-6a90abbdfc7c Mistral-OCR: OCR-Anfrage wird gesendet
{"progress": 60, "ocr_url": "https://api.mistral.ai/v1/ocr", "model": "mistral-ocr-latest", "file_id": "09f3100c-816d-4b61-9e2e-8ee254860969", "pages_count": null, "include_image_base64": true}
2026-03-25 14:13:30,033 INFO pdf_processor.py:_process_mistral_ocr:483 TransformerProcessor job-2da427dc-fb99-4d7e-96e9-6a90abbdfc7c Mistral-OCR: Upload abgeschlossen
{"progress": 30, "file_id": "09f3100c-816d-4b61-9e2e-8ee254860969", "upload_response_keys": ["id", "object", "bytes", "created_at", "filename", "purpose", "sample_type", "num_lines", "mimetype", "source", "signature", "expires_at", "visibility"]}
2026-03-25 14:13:30,032 DEBUG pdf_processor.py:_process_mistral_ocr:469 TransformerProcessor job-2da427dc-fb99-4d7e-96e9-6a90abbdfc7c Mistral-OCR: Upload Response
{"status_code": 200, "headers": {"Date": "Wed, 25 Mar 2026 14:13:30 GMT", "Content-Type": "application/json; charset=utf-8", "Transfer-Encoding": "chunked", "Connection": "keep-alive", "mistral-correlation-id": "019d2557-e0e6-7554-aa6a-f1c9a04d1f5d", "x-kong-request-id": "019d2557-e0e6-7554-aa6a-f1c9a04d1f5d", "Server": "cloudflare", "x-envoy-upstream-service-time": "457", "access-control-allow-origin": "*", "x-kong-upstream-latency": "551", "x-kong-proxy-latency": "10", "cf-cache-status": "DYNAMIC", "set-cookie": "__cf_bm=WskbcjpGDoihtGhc3tLb2XxdMbZe2DEdDC6zxbSqu.w-1774448009.4114373-1.0.1.1-170MqXeI429XhpjpyqncEc_7_bQc3l1iSutwXBaive61uSh5NSHuVRHq5P4LR4IHudIa48YYPVdetdvKJrGpiRFpIZciSAY1bM4hHh_FlChVAslZ8rZ.zzpCB.q94RDR; HttpOnly; Secure; Path=/; Domain=mistral.ai; Expires=Wed, 25 Mar 2026 14:43:30 GMT, _cfuvid=QsBmiiF7sWJEbN3jrRPHMo6JeUvDbWESp3X4rGoFMac-1774448009.4114373-1.0.1.1-QwRESdd5_ou3EDbeQIlCDHmv.HgZXtbfG1Ust6ZqdTs; HttpOnly; SameSite=None; Secure; Path=/; Domain=mistral.ai", "Strict-Transport-Security": "max-age=15552000; includeSubDomains; preload", "X-Content-Type-Options": "nosniff", "Content-Encoding": "br", "CF-RAY": "9e1e843adff090e7-FRA", "alt-svc": "h3=\":443\"; ma=86400"}, "response_size": 380}
2026-03-25 14:13:29,373 INFO pdf_processor.py:_process_mistral_ocr:456 TransformerProcessor job-2da427dc-fb99-4d7e-96e9-6a90abbdfc7c Mistral-OCR: Upload startet
{"progress": 10, "file_name": "upload_8df038d4-6805-4c1e-b856-a55622edfb58.pdf", "file_size": 1977094, "upload_url": "https://api.mistral.ai/v1/files"}
2026-03-25 14:13:29,373 INFO pdf_processor.py:_process_mistral_ocr:442 TransformerProcessor job-2da427dc-fb99-4d7e-96e9-6a90abbdfc7c Mistral-OCR: Verarbeitung startet
{"progress": 5, "file_name": "upload_8df038d4-6805-4c1e-b856-a55622edfb58.pdf", "file_size": 1977094, "page_start": null, "page_end": null, "include_ocr_images": true}
2026-03-25 14:13:29,306 DEBUG transformer_processor.py:__init__:199 TransformerProcessor job-2da427dc-fb99-4d7e-96e9-6a90abbdfc7c Transformer Init Timings
{"total_init_ms": 208.3, "super_init_ms": 86.07, "config_load_ms": 17.11, "client_init_ms": 25.03, "transcriber_init_ms": 79.78}
2026-03-25 14:13:29,306 DEBUG transformer_processor.py:__init__:179 TransformerProcessor job-2da427dc-fb99-4d7e-96e9-6a90abbdfc7c Transformer Processor initialisiert
{"model": "google/gemini-2.5-flash", "temperature": 0.1, "max_tokens": 4000, "target_format": "text", "max_concurrent_requests": 10, "timeout_seconds": 120, "templates_dir": "resources/templates"}
2026-03-25 14:13:29,098 DEBUG imageocr_processor.py:__init__:224 ImageOCRProcessor job-2da427dc-fb99-4d7e-96e9-6a90abbdfc7c ImageOCRProcessor initialisiert mit Konfiguration
{"max_file_size": 10485760, "max_resolution": 4096, "temp_dir": "cache/imageocr/temp", "cache_dir": "cache/imageocr"}
2026-03-25 14:13:28,992 DEBUG transformer_processor.py:__init__:199 TransformerProcessor job-2da427dc-fb99-4d7e-96e9-6a90abbdfc7c Transformer Init Timings
{"total_init_ms": 183.04, "super_init_ms": 85.3, "config_load_ms": 16.74, "client_init_ms": 0.01, "transcriber_init_ms": 80.66}
2026-03-25 14:13:28,992 DEBUG transformer_processor.py:__init__:179 TransformerProcessor job-2da427dc-fb99-4d7e-96e9-6a90abbdfc7c Transformer Processor initialisiert
{"model": "google/gemini-2.5-flash", "temperature": 0.1, "max_tokens": 4000, "target_format": "text", "max_concurrent_requests": 10, "timeout_seconds": 120, "templates_dir": "resources/templates"}
2026-03-25 14:13:28,809 DEBUG pdf_processor.py:__init__:236 PDFProcessor job-2da427dc-fb99-4d7e-96e9-6a90abbdfc7c PDFProcessor initialisiert mit Konfiguration
{"max_file_size": 150000000, "max_pages": 500, "temp_dir": "cache/pdf/temp", "cache_dir": "cache/pdf", "main_image_max_size": 1280, "main_image_format": "jpg", "preview_image_max_size": 360, "preview_image_format": "jpg"}
2026-03-25 14:13:28,552 DEBUG base_processor.py:init_logger:386 PDFProcessor job-2da427dc-fb99-4d7e-96e9-6a90abbdfc7c Logger initialisiert
2026-03-25 10:42:25,094 DEBUG transformer_processor.py:__init__:199 TransformerProcessor 0331c5bb-d6ba-4354-8c5c-4a56e70cba0e Transformer Init Timings
{"total_init_ms": 192.59, "super_init_ms": 96.95, "config_load_ms": 16.81, "client_init_ms": 0.01, "transcriber_init_ms": 78.51}
2026-03-25 10:42:25,094 DEBUG transformer_processor.py:__init__:179 TransformerProcessor 0331c5bb-d6ba-4354-8c5c-4a56e70cba0e Transformer Processor initialisiert
{"model": "google/gemini-2.5-flash", "temperature": 0.1, "max_tokens": 4000, "target_format": "text", "max_concurrent_requests": 10, "timeout_seconds": 120, "templates_dir": "resources/templates"}
2026-03-25 10:42:24,902 DEBUG base_processor.py:init_logger:386 TransformerProcessor 0331c5bb-d6ba-4354-8c5c-4a56e70cba0e Logger initialisiert
2026-03-25 10:42:20,132 INFO rag_processor.py:embed_document_for_client:571 RAGProcessor 5ce33767-f639-46e8-b4c2-6847be7abe51 Embedding (client) abgeschlossen
{"extra": {"document_id": "a6ca71fa-f7cd-4400-9c31-a2a64cabb8b3", "chunks": 1, "duration_seconds": 0.27097153663635254}}
2026-03-25 10:42:19,862 INFO rag_processor.py:embed_document_for_client:516 RAGProcessor 5ce33767-f639-46e8-b4c2-6847be7abe51 Starte Embedding-Generierung (client)
{"extra": {"chunk_count": 1, "embedding_model": "voyage-3-large", "embedding_dimensions": 2048}}
2026-03-25 10:42:19,861 INFO rag_processor.py:embed_document_for_client:510 RAGProcessor 5ce33767-f639-46e8-b4c2-6847be7abe51 Chunking abgeschlossen (client): 1 Chunks erstellt
2026-03-25 10:42:19,861 INFO rag_processor.py:embed_document_for_client:508 RAGProcessor 5ce33767-f639-46e8-b4c2-6847be7abe51 Starte Chunking (client)
{"extra": {"document_id": "a6ca71fa-f7cd-4400-9c31-a2a64cabb8b3"}}
2026-03-25 10:42:19,861 INFO rag_processor.py:__init__:124 RAGProcessor 5ce33767-f639-46e8-b4c2-6847be7abe51 RAGProcessor initialisiert
{"extra": {"embedding_model": "voyage-3-large", "embedding_dimensions": 2048, "chunk_size": 1000, "chunk_overlap": 200}}
2026-03-25 10:42:19,751 DEBUG base_processor.py:init_logger:386 RAGProcessor 5ce33767-f639-46e8-b4c2-6847be7abe51 Logger initialisiert
2026-03-25 10:38:24,231 DEBUG transformer_processor.py:__init__:199 TransformerProcessor 312eb80f-97a5-4e26-8a5f-a8cc42e4ce95 Transformer Init Timings
{"total_init_ms": 187.07, "super_init_ms": 92.34, "config_load_ms": 16.9, "client_init_ms": 0.01, "transcriber_init_ms": 77.53}
2026-03-25 10:38:24,231 DEBUG transformer_processor.py:__init__:179 TransformerProcessor 312eb80f-97a5-4e26-8a5f-a8cc42e4ce95 Transformer Processor initialisiert
{"model": "google/gemini-2.5-flash", "temperature": 0.1, "max_tokens": 4000, "target_format": "text", "max_concurrent_requests": 10, "timeout_seconds": 120, "templates_dir": "resources/templates"}
2026-03-25 10:38:24,045 DEBUG base_processor.py:init_logger:386 TransformerProcessor 312eb80f-97a5-4e26-8a5f-a8cc42e4ce95 Logger initialisiert
2026-03-25 10:37:35,080 DEBUG transformer_processor.py:__init__:199 TransformerProcessor 3a375924-2518-458f-a903-748883dec659 Transformer Init Timings
{"total_init_ms": 192.76, "super_init_ms": 90.56, "config_load_ms": 17.9, "client_init_ms": 0.01, "transcriber_init_ms": 83.98}
2026-03-25 10:37:35,080 DEBUG transformer_processor.py:__init__:179 TransformerProcessor 3a375924-2518-458f-a903-748883dec659 Transformer Processor initialisiert
{"model": "google/gemini-2.5-flash", "temperature": 0.1, "max_tokens": 4000, "target_format": "text", "max_concurrent_requests": 10, "timeout_seconds": 120, "templates_dir": "resources/templates"}
2026-03-25 10:37:34,887 DEBUG base_processor.py:init_logger:386 TransformerProcessor 3a375924-2518-458f-a903-748883dec659 Logger initialisiert
2026-03-25 08:31:10,133 DEBUG transformer_processor.py:__init__:199 TransformerProcessor 1e5bd5b6-60e8-46f7-adff-4f56d3eb8d5e Transformer Init Timings
{"total_init_ms": 191.61, "super_init_ms": 92.02, "config_load_ms": 17.09, "client_init_ms": 0.02, "transcriber_init_ms": 82.13}
2026-03-25 08:31:10,132 DEBUG transformer_processor.py:__init__:179 TransformerProcessor 1e5bd5b6-60e8-46f7-adff-4f56d3eb8d5e Transformer Processor initialisiert
{"model": "google/gemini-2.5-flash", "temperature": 0.1, "max_tokens": 4000, "target_format": "text", "max_concurrent_requests": 10, "timeout_seconds": 120, "templates_dir": "resources/templates"}
2026-03-25 08:31:09,942 DEBUG base_processor.py:init_logger:386 TransformerProcessor 1e5bd5b6-60e8-46f7-adff-4f56d3eb8d5e Logger initialisiert
2026-03-24 16:37:54,881 INFO transcription_utils.py:transform_by_template:1461 TransformerProcessor 07967054-74ef-4103-a874-4f0fdff41c49 Template-Transformation abgeschlossen
{"duration_ms": 9051.43928527832, "model": "google/gemini-2.5-flash"}
2026-03-24 16:37:54,880 DEBUG transcription_utils.py:create_llm_request:455 TransformerProcessor 07967054-74ef-4103-a874-4f0fdff41c49 LLM Interaktion gespeichert
{"file": "cache/transformer/temp/debug/llm/20260324_163754_template_transform.json", "tokens": 6197}
2026-03-24 16:37:54,880 DEBUG transcription_utils.py:create_llm_request:416 TransformerProcessor 07967054-74ef-4103-a874-4f0fdff41c49 LLM Request erstellt
{"purpose": "template_transform", "tokens": 6197, "duration": 9051.43928527832, "model": "google/gemini-2.5-flash", "processor": "WhisperTranscriber"}
2026-03-24 16:37:54,879 DEBUG base_processor.py:add_llm_requests:510 TransformerProcessor 07967054-74ef-4103-a874-4f0fdff41c49 1 LLM-Requests hinzugefügt
2026-03-24 16:37:45,828 INFO transcription_utils.py:transform_by_template:1187 TransformerProcessor 07967054-74ef-4103-a874-4f0fdff41c49 Sende Anfrage an LLM Provider
2026-03-24 16:37:45,827 INFO transcription_utils.py:transform_by_template:1171 TransformerProcessor 07967054-74ef-4103-a874-4f0fdff41c49 Token-Prüfung vor Template-Transformation
{"model": "google/gemini-2.5-flash", "model_limit": 8192, "text_length": 4070, "estimated_text_tokens": 2822, "estimated_system_tokens": 3080, "estimated_template_tokens": 1220, "total_estimated_tokens": 7122}
2026-03-24 16:37:45,805 INFO transcription_utils.py:_extract_system_prompt:1753 TransformerProcessor 07967054-74ef-4103-a874-4f0fdff41c49 Systemprompt aus Template extrahiert
{"prompt_length": 8462}
2026-03-24 16:37:45,804 INFO transcription_utils.py:transform_by_template:1042 TransformerProcessor 07967054-74ef-4103-a874-4f0fdff41c49 Verwende direkt übergebenes Template-Inhalt (Länge: 12015)
2026-03-24 16:37:45,804 INFO transcription_utils.py:transform_by_template:1027 TransformerProcessor 07967054-74ef-4103-a874-4f0fdff41c49 Starte Template-Transformation:
2026-03-24 16:37:45,804 DEBUG transformer_processor.py:__init__:199 TransformerProcessor 07967054-74ef-4103-a874-4f0fdff41c49 Transformer Init Timings
{"total_init_ms": 211.59, "super_init_ms": 106.29, "config_load_ms": 17.0, "client_init_ms": 0.02, "transcriber_init_ms": 87.89}
2026-03-24 16:37:45,804 DEBUG transformer_processor.py:__init__:179 TransformerProcessor 07967054-74ef-4103-a874-4f0fdff41c49 Transformer Processor initialisiert
{"model": "google/gemini-2.5-flash", "temperature": 0.1, "max_tokens": 4000, "target_format": "text", "max_concurrent_requests": 10, "timeout_seconds": 120, "templates_dir": "resources/templates"}
2026-03-24 16:37:45,593 DEBUG base_processor.py:init_logger:386 TransformerProcessor 07967054-74ef-4103-a874-4f0fdff41c49 Logger initialisiert
2026-03-24 16:37:17,746 INFO transcription_utils.py:transform_by_template:1461 TransformerProcessor a6b00575-6511-44a9-abd9-32b19ff2daf3 Template-Transformation abgeschlossen
{"duration_ms": 3817.7578449249268, "model": "google/gemini-2.5-flash"}
2026-03-24 16:37:17,745 DEBUG transcription_utils.py:create_llm_request:455 TransformerProcessor a6b00575-6511-44a9-abd9-32b19ff2daf3 LLM Interaktion gespeichert
{"file": "cache/transformer/temp/debug/llm/20260324_163717_template_transform.json", "tokens": 6094}
2026-03-24 16:37:17,743 DEBUG transcription_utils.py:create_llm_request:416 TransformerProcessor a6b00575-6511-44a9-abd9-32b19ff2daf3 LLM Request erstellt
{"purpose": "template_transform", "tokens": 6094, "duration": 3817.7578449249268, "model": "google/gemini-2.5-flash", "processor": "WhisperTranscriber"}
2026-03-24 16:37:17,743 DEBUG base_processor.py:add_llm_requests:510 TransformerProcessor a6b00575-6511-44a9-abd9-32b19ff2daf3 1 LLM-Requests hinzugefügt
2026-03-24 16:37:16,964 INFO transcription_utils.py:transform_by_template:1461 TransformerProcessor dd9698d9-b151-454f-8727-db2da969ad24 Template-Transformation abgeschlossen
{"duration_ms": 3280.4043292999268, "model": "google/gemini-2.5-flash"}
2026-03-24 16:37:16,963 DEBUG transcription_utils.py:create_llm_request:455 TransformerProcessor dd9698d9-b151-454f-8727-db2da969ad24 LLM Interaktion gespeichert
{"file": "cache/transformer/temp/debug/llm/20260324_163716_template_transform.json", "tokens": 4681}
2026-03-24 16:37:16,962 DEBUG transcription_utils.py:create_llm_request:416 TransformerProcessor dd9698d9-b151-454f-8727-db2da969ad24 LLM Request erstellt
{"purpose": "template_transform", "tokens": 4681, "duration": 3280.4043292999268, "model": "google/gemini-2.5-flash", "processor": "WhisperTranscriber"}
2026-03-24 16:37:16,962 DEBUG base_processor.py:add_llm_requests:510 TransformerProcessor dd9698d9-b151-454f-8727-db2da969ad24 1 LLM-Requests hinzugefügt
2026-03-24 16:37:15,750 INFO transcription_utils.py:transform_by_template:1461 TransformerProcessor 0268adc0-8c33-4962-bfb9-54195ad1a28f Template-Transformation abgeschlossen
{"duration_ms": 2792.086601257324, "model": "google/gemini-2.5-flash"}
2026-03-24 16:37:15,749 DEBUG transcription_utils.py:create_llm_request:455 TransformerProcessor 0268adc0-8c33-4962-bfb9-54195ad1a28f LLM Interaktion gespeichert
{"file": "cache/transformer/temp/debug/llm/20260324_163715_template_transform.json", "tokens": 5022}
2026-03-24 16:37:15,748 DEBUG transcription_utils.py:create_llm_request:416 TransformerProcessor 0268adc0-8c33-4962-bfb9-54195ad1a28f LLM Request erstellt
{"purpose": "template_transform", "tokens": 5022, "duration": 2792.086601257324, "model": "google/gemini-2.5-flash", "processor": "WhisperTranscriber"}
2026-03-24 16:37:15,747 DEBUG base_processor.py:add_llm_requests:510 TransformerProcessor 0268adc0-8c33-4962-bfb9-54195ad1a28f 1 LLM-Requests hinzugefügt
2026-03-24 16:37:13,925 INFO transcription_utils.py:transform_by_template:1187 TransformerProcessor a6b00575-6511-44a9-abd9-32b19ff2daf3 Sende Anfrage an LLM Provider
2026-03-24 16:37:13,925 INFO transcription_utils.py:transform_by_template:1171 TransformerProcessor a6b00575-6511-44a9-abd9-32b19ff2daf3 Token-Prüfung vor Template-Transformation
{"model": "google/gemini-2.5-flash", "model_limit": 8192, "text_length": 2850, "estimated_text_tokens": 1628, "estimated_system_tokens": 3080, "estimated_template_tokens": 1226, "total_estimated_tokens": 5934}
2026-03-24 16:37:13,901 INFO transcription_utils.py:_extract_system_prompt:1753 TransformerProcessor a6b00575-6511-44a9-abd9-32b19ff2daf3 Systemprompt aus Template extrahiert
{"prompt_length": 8462}
2026-03-24 16:37:13,901 INFO transcription_utils.py:transform_by_template:1042 TransformerProcessor a6b00575-6511-44a9-abd9-32b19ff2daf3 Verwende direkt übergebenes Template-Inhalt (Länge: 12015)
2026-03-24 16:37:13,901 INFO transcription_utils.py:transform_by_template:1027 TransformerProcessor a6b00575-6511-44a9-abd9-32b19ff2daf3 Starte Template-Transformation:
2026-03-24 16:37:13,901 DEBUG transformer_processor.py:__init__:199 TransformerProcessor a6b00575-6511-44a9-abd9-32b19ff2daf3 Transformer Init Timings
{"total_init_ms": 193.7, "super_init_ms": 105.75, "config_load_ms": 17.42, "client_init_ms": 0.01, "transcriber_init_ms": 70.18}
2026-03-24 16:37:13,901 DEBUG transformer_processor.py:__init__:179 TransformerProcessor a6b00575-6511-44a9-abd9-32b19ff2daf3 Transformer Processor initialisiert
{"model": "google/gemini-2.5-flash", "temperature": 0.1, "max_tokens": 4000, "target_format": "text", "max_concurrent_requests": 10, "timeout_seconds": 120, "templates_dir": "resources/templates"}
2026-03-24 16:37:13,708 DEBUG base_processor.py:init_logger:386 TransformerProcessor a6b00575-6511-44a9-abd9-32b19ff2daf3 Logger initialisiert
2026-03-24 16:37:13,681 INFO transcription_utils.py:transform_by_template:1187 TransformerProcessor dd9698d9-b151-454f-8727-db2da969ad24 Sende Anfrage an LLM Provider
2026-03-24 16:37:13,681 INFO transcription_utils.py:transform_by_template:1171 TransformerProcessor dd9698d9-b151-454f-8727-db2da969ad24 Token-Prüfung vor Template-Transformation
{"model": "google/gemini-2.5-flash", "model_limit": 8192, "text_length": 169, "estimated_text_tokens": 134, "estimated_system_tokens": 3080, "estimated_template_tokens": 1244, "total_estimated_tokens": 4458}
2026-03-24 16:37:13,657 INFO transcription_utils.py:_extract_system_prompt:1753 TransformerProcessor dd9698d9-b151-454f-8727-db2da969ad24 Systemprompt aus Template extrahiert
{"prompt_length": 8462}
2026-03-24 16:37:13,656 INFO transcription_utils.py:transform_by_template:1042 TransformerProcessor dd9698d9-b151-454f-8727-db2da969ad24 Verwende direkt übergebenes Template-Inhalt (Länge: 12015)
2026-03-24 16:37:13,656 INFO transcription_utils.py:transform_by_template:1027 TransformerProcessor dd9698d9-b151-454f-8727-db2da969ad24 Starte Template-Transformation:
2026-03-24 16:37:13,656 DEBUG transformer_processor.py:__init__:199 TransformerProcessor dd9698d9-b151-454f-8727-db2da969ad24 Transformer Init Timings
{"total_init_ms": 201.14, "super_init_ms": 103.58, "config_load_ms": 16.97, "client_init_ms": 0.02, "transcriber_init_ms": 80.26}
2026-03-24 16:37:13,656 DEBUG transformer_processor.py:__init__:179 TransformerProcessor dd9698d9-b151-454f-8727-db2da969ad24 Transformer Processor initialisiert
{"model": "google/gemini-2.5-flash", "temperature": 0.1, "max_tokens": 4000, "target_format": "text", "max_concurrent_requests": 10, "timeout_seconds": 120, "templates_dir": "resources/templates"}
2026-03-24 16:37:13,455 DEBUG base_processor.py:init_logger:386 TransformerProcessor dd9698d9-b151-454f-8727-db2da969ad24 Logger initialisiert
2026-03-24 16:37:12,955 INFO transcription_utils.py:transform_by_template:1187 TransformerProcessor 0268adc0-8c33-4962-bfb9-54195ad1a28f Sende Anfrage an LLM Provider
2026-03-24 16:37:12,955 INFO transcription_utils.py:transform_by_template:1171 TransformerProcessor 0268adc0-8c33-4962-bfb9-54195ad1a28f Token-Prüfung vor Template-Transformation
{"model": "google/gemini-2.5-flash", "model_limit": 8192, "text_length": 756, "estimated_text_tokens": 604, "estimated_system_tokens": 3080, "estimated_template_tokens": 1202, "total_estimated_tokens": 4886}
2026-03-24 16:37:12,932 INFO transcription_utils.py:_extract_system_prompt:1753 TransformerProcessor 0268adc0-8c33-4962-bfb9-54195ad1a28f Systemprompt aus Template extrahiert
{"prompt_length": 8462}
2026-03-24 16:37:12,932 INFO transcription_utils.py:transform_by_template:1042 TransformerProcessor 0268adc0-8c33-4962-bfb9-54195ad1a28f Verwende direkt übergebenes Template-Inhalt (Länge: 12015)
2026-03-24 16:37:12,931 INFO transcription_utils.py:transform_by_template:1027 TransformerProcessor 0268adc0-8c33-4962-bfb9-54195ad1a28f Starte Template-Transformation:
2026-03-24 16:37:12,931 DEBUG transformer_processor.py:__init__:199 TransformerProcessor 0268adc0-8c33-4962-bfb9-54195ad1a28f Transformer Init Timings
{"total_init_ms": 206.41, "super_init_ms": 103.96, "config_load_ms": 18.33, "client_init_ms": 0.01, "transcriber_init_ms": 83.7}
2026-03-24 16:37:12,931 DEBUG transformer_processor.py:__init__:179 TransformerProcessor 0268adc0-8c33-4962-bfb9-54195ad1a28f Transformer Processor initialisiert
{"model": "google/gemini-2.5-flash", "temperature": 0.1, "max_tokens": 4000, "target_format": "text", "max_concurrent_requests": 10, "timeout_seconds": 120, "templates_dir": "resources/templates"}
2026-03-24 16:37:12,725 DEBUG base_processor.py:init_logger:386 TransformerProcessor 0268adc0-8c33-4962-bfb9-54195ad1a28f Logger initialisiert
2026-03-24 16:37:07,699 INFO transcription_utils.py:transform_by_template:1461 TransformerProcessor 1a1c08b2-91b4-4da5-9e43-c7219c47c26a Template-Transformation abgeschlossen
{"duration_ms": 8552.122354507446, "model": "google/gemini-2.5-flash"}
2026-03-24 16:37:07,698 DEBUG transcription_utils.py:create_llm_request:455 TransformerProcessor 1a1c08b2-91b4-4da5-9e43-c7219c47c26a LLM Interaktion gespeichert
{"file": "cache/transformer/temp/debug/llm/20260324_163707_template_transform.json", "tokens": 5405}
2026-03-24 16:37:07,696 DEBUG transcription_utils.py:create_llm_request:416 TransformerProcessor 1a1c08b2-91b4-4da5-9e43-c7219c47c26a LLM Request erstellt
{"purpose": "template_transform", "tokens": 5405, "duration": 8552.122354507446, "model": "google/gemini-2.5-flash", "processor": "WhisperTranscriber"}
2026-03-24 16:37:07,696 DEBUG base_processor.py:add_llm_requests:510 TransformerProcessor 1a1c08b2-91b4-4da5-9e43-c7219c47c26a 1 LLM-Requests hinzugefügt
2026-03-24 16:37:06,431 INFO transcription_utils.py:transform_by_template:1461 TransformerProcessor 705136e9-4fa4-48cb-8bde-491ae08b1729 Template-Transformation abgeschlossen
{"duration_ms": 2799.893379211426, "model": "google/gemini-2.5-flash"}
2026-03-24 16:37:06,430 DEBUG transcription_utils.py:create_llm_request:455 TransformerProcessor 705136e9-4fa4-48cb-8bde-491ae08b1729 LLM Interaktion gespeichert
{"file": "cache/transformer/temp/debug/llm/20260324_163706_template_transform.json", "tokens": 5086}
2026-03-24 16:37:06,428 DEBUG transcription_utils.py:create_llm_request:416 TransformerProcessor 705136e9-4fa4-48cb-8bde-491ae08b1729 LLM Request erstellt
{"purpose": "template_transform", "tokens": 5086, "duration": 2799.893379211426, "model": "google/gemini-2.5-flash", "processor": "WhisperTranscriber"}
2026-03-24 16:37:06,428 DEBUG base_processor.py:add_llm_requests:510 TransformerProcessor 705136e9-4fa4-48cb-8bde-491ae08b1729 1 LLM-Requests hinzugefügt
2026-03-24 16:37:03,627 INFO transcription_utils.py:transform_by_template:1187 TransformerProcessor 705136e9-4fa4-48cb-8bde-491ae08b1729 Sende Anfrage an LLM Provider
2026-03-24 16:37:03,627 INFO transcription_utils.py:transform_by_template:1171 TransformerProcessor 705136e9-4fa4-48cb-8bde-491ae08b1729 Token-Prüfung vor Template-Transformation
{"model": "google/gemini-2.5-flash", "model_limit": 8192, "text_length": 679, "estimated_text_tokens": 608, "estimated_system_tokens": 3080, "estimated_template_tokens": 1202, "total_estimated_tokens": 4890}
2026-03-24 16:37:03,599 INFO transcription_utils.py:_extract_system_prompt:1753 TransformerProcessor 705136e9-4fa4-48cb-8bde-491ae08b1729 Systemprompt aus Template extrahiert
{"prompt_length": 8462}
2026-03-24 16:37:03,599 INFO transcription_utils.py:transform_by_template:1042 TransformerProcessor 705136e9-4fa4-48cb-8bde-491ae08b1729 Verwende direkt übergebenes Template-Inhalt (Länge: 12015)
2026-03-24 16:37:03,599 INFO transcription_utils.py:transform_by_template:1027 TransformerProcessor 705136e9-4fa4-48cb-8bde-491ae08b1729 Starte Template-Transformation:
2026-03-24 16:37:03,598 DEBUG transformer_processor.py:__init__:199 TransformerProcessor 705136e9-4fa4-48cb-8bde-491ae08b1729 Transformer Init Timings
{"total_init_ms": 211.2, "super_init_ms": 110.96, "config_load_ms": 17.82, "client_init_ms": 0.01, "transcriber_init_ms": 82.08}
2026-03-24 16:37:03,598 DEBUG transformer_processor.py:__init__:179 TransformerProcessor 705136e9-4fa4-48cb-8bde-491ae08b1729 Transformer Processor initialisiert
{"model": "google/gemini-2.5-flash", "temperature": 0.1, "max_tokens": 4000, "target_format": "text", "max_concurrent_requests": 10, "timeout_seconds": 120, "templates_dir": "resources/templates"}
2026-03-24 16:37:03,388 DEBUG base_processor.py:init_logger:386 TransformerProcessor 705136e9-4fa4-48cb-8bde-491ae08b1729 Logger initialisiert
2026-03-24 16:37:02,862 INFO transcription_utils.py:transform_by_template:1461 TransformerProcessor 9dbd9ec4-cc43-4626-9295-c74b95f796c0 Template-Transformation abgeschlossen
{"duration_ms": 7102.603912353516, "model": "google/gemini-2.5-flash"}
2026-03-24 16:37:02,861 DEBUG transcription_utils.py:create_llm_request:455 TransformerProcessor 9dbd9ec4-cc43-4626-9295-c74b95f796c0 LLM Interaktion gespeichert
{"file": "cache/transformer/temp/debug/llm/20260324_163702_template_transform.json", "tokens": 4570}
2026-03-24 16:37:02,860 DEBUG transcription_utils.py:create_llm_request:416 TransformerProcessor 9dbd9ec4-cc43-4626-9295-c74b95f796c0 LLM Request erstellt
{"purpose": "template_transform", "tokens": 4570, "duration": 7102.603912353516, "model": "google/gemini-2.5-flash", "processor": "WhisperTranscriber"}
2026-03-24 16:37:02,859 DEBUG base_processor.py:add_llm_requests:510 TransformerProcessor 9dbd9ec4-cc43-4626-9295-c74b95f796c0 1 LLM-Requests hinzugefügt
2026-03-24 16:36:59,143 INFO transcription_utils.py:transform_by_template:1187 TransformerProcessor 1a1c08b2-91b4-4da5-9e43-c7219c47c26a Sende Anfrage an LLM Provider
2026-03-24 16:36:59,143 INFO transcription_utils.py:transform_by_template:1171 TransformerProcessor 1a1c08b2-91b4-4da5-9e43-c7219c47c26a Token-Prüfung vor Template-Transformation
{"model": "google/gemini-2.5-flash", "model_limit": 8192, "text_length": 1648, "estimated_text_tokens": 1318, "estimated_system_tokens": 3080, "estimated_template_tokens": 1208, "total_estimated_tokens": 5606}
2026-03-24 16:36:59,116 INFO transcription_utils.py:_extract_system_prompt:1753 TransformerProcessor 1a1c08b2-91b4-4da5-9e43-c7219c47c26a Systemprompt aus Template extrahiert
{"prompt_length": 8462}
2026-03-24 16:36:59,115 INFO transcription_utils.py:transform_by_template:1042 TransformerProcessor 1a1c08b2-91b4-4da5-9e43-c7219c47c26a Verwende direkt übergebenes Template-Inhalt (Länge: 12015)
2026-03-24 16:36:59,115 INFO transcription_utils.py:transform_by_template:1027 TransformerProcessor 1a1c08b2-91b4-4da5-9e43-c7219c47c26a Starte Template-Transformation:
2026-03-24 16:36:59,115 DEBUG transformer_processor.py:__init__:199 TransformerProcessor 1a1c08b2-91b4-4da5-9e43-c7219c47c26a Transformer Init Timings
{"total_init_ms": 292.31, "super_init_ms": 177.69, "config_load_ms": 17.4, "client_init_ms": 0.01, "transcriber_init_ms": 96.75}
2026-03-24 16:36:59,114 DEBUG transformer_processor.py:__init__:179 TransformerProcessor 1a1c08b2-91b4-4da5-9e43-c7219c47c26a Transformer Processor initialisiert
{"model": "google/gemini-2.5-flash", "temperature": 0.1, "max_tokens": 4000, "target_format": "text", "max_concurrent_requests": 10, "timeout_seconds": 120, "templates_dir": "resources/templates"}
2026-03-24 16:36:58,823 DEBUG base_processor.py:init_logger:386 TransformerProcessor 1a1c08b2-91b4-4da5-9e43-c7219c47c26a Logger initialisiert
2026-03-24 16:36:57,721 INFO transcription_utils.py:transform_by_template:1461 TransformerProcessor 21dc1862-7e06-474b-88cb-766001da1f12 Template-Transformation abgeschlossen
{"duration_ms": 8059.731483459473, "model": "google/gemini-2.5-flash"}
2026-03-24 16:36:57,720 DEBUG transcription_utils.py:create_llm_request:455 TransformerProcessor 21dc1862-7e06-474b-88cb-766001da1f12 LLM Interaktion gespeichert
{"file": "cache/transformer/temp/debug/llm/20260324_163657_template_transform.json", "tokens": 17114}
2026-03-24 16:36:57,719 DEBUG transcription_utils.py:create_llm_request:416 TransformerProcessor 21dc1862-7e06-474b-88cb-766001da1f12 LLM Request erstellt
{"purpose": "template_transform", "tokens": 17114, "duration": 8059.731483459473, "model": "google/gemini-2.5-flash", "processor": "WhisperTranscriber"}
2026-03-24 16:36:57,719 DEBUG base_processor.py:add_llm_requests:510 TransformerProcessor 21dc1862-7e06-474b-88cb-766001da1f12 1 LLM-Requests hinzugefügt
2026-03-24 16:36:55,756 INFO transcription_utils.py:transform_by_template:1187 TransformerProcessor 9dbd9ec4-cc43-4626-9295-c74b95f796c0 Sende Anfrage an LLM Provider
2026-03-24 16:36:55,756 INFO transcription_utils.py:transform_by_template:1171 TransformerProcessor 9dbd9ec4-cc43-4626-9295-c74b95f796c0 Token-Prüfung vor Template-Transformation
{"model": "google/gemini-2.5-flash", "model_limit": 8192, "text_length": 60, "estimated_text_tokens": 48, "estimated_system_tokens": 3080, "estimated_template_tokens": 1256, "total_estimated_tokens": 4384}
2026-03-24 16:36:55,733 INFO transcription_utils.py:_extract_system_prompt:1753 TransformerProcessor 9dbd9ec4-cc43-4626-9295-c74b95f796c0 Systemprompt aus Template extrahiert
{"prompt_length": 8462}
2026-03-24 16:36:55,733 INFO transcription_utils.py:transform_by_template:1042 TransformerProcessor 9dbd9ec4-cc43-4626-9295-c74b95f796c0 Verwende direkt übergebenes Template-Inhalt (Länge: 12015)
2026-03-24 16:36:55,732 INFO transcription_utils.py:transform_by_template:1027 TransformerProcessor 9dbd9ec4-cc43-4626-9295-c74b95f796c0 Starte Template-Transformation:
2026-03-24 16:36:55,732 DEBUG transformer_processor.py:__init__:199 TransformerProcessor 9dbd9ec4-cc43-4626-9295-c74b95f796c0 Transformer Init Timings
{"total_init_ms": 254.37, "super_init_ms": 132.61, "config_load_ms": 17.03, "client_init_ms": 0.01, "transcriber_init_ms": 104.29}
2026-03-24 16:36:55,732 DEBUG transformer_processor.py:__init__:179 TransformerProcessor 9dbd9ec4-cc43-4626-9295-c74b95f796c0 Transformer Processor initialisiert
{"model": "google/gemini-2.5-flash", "temperature": 0.1, "max_tokens": 4000, "target_format": "text", "max_concurrent_requests": 10, "timeout_seconds": 120, "templates_dir": "resources/templates"}
2026-03-24 16:36:55,478 DEBUG base_processor.py:init_logger:386 TransformerProcessor 9dbd9ec4-cc43-4626-9295-c74b95f796c0 Logger initialisiert
2026-03-24 16:36:50,210 INFO transcription_utils.py:transform_by_template:1461 TransformerProcessor 2b3972df-39f5-43d6-9c77-c64ee744f864 Template-Transformation abgeschlossen
{"duration_ms": 7074.095249176025, "model": "google/gemini-2.5-flash"}
2026-03-24 16:36:50,208 DEBUG transcription_utils.py:create_llm_request:455 TransformerProcessor 2b3972df-39f5-43d6-9c77-c64ee744f864 LLM Interaktion gespeichert
{"file": "cache/transformer/temp/debug/llm/20260324_163650_template_transform.json", "tokens": 13842}
2026-03-24 16:36:50,207 DEBUG transcription_utils.py:create_llm_request:416 TransformerProcessor 2b3972df-39f5-43d6-9c77-c64ee744f864 LLM Request erstellt
{"purpose": "template_transform", "tokens": 13842, "duration": 7074.095249176025, "model": "google/gemini-2.5-flash", "processor": "WhisperTranscriber"}
2026-03-24 16:36:50,206 DEBUG base_processor.py:add_llm_requests:510 TransformerProcessor 2b3972df-39f5-43d6-9c77-c64ee744f864 1 LLM-Requests hinzugefügt
2026-03-24 16:36:49,659 INFO transcription_utils.py:transform_by_template:1187 TransformerProcessor 21dc1862-7e06-474b-88cb-766001da1f12 Sende Anfrage an LLM Provider
2026-03-24 16:36:49,659 INFO transcription_utils.py:transform_by_template:1171 TransformerProcessor 21dc1862-7e06-474b-88cb-766001da1f12 Token-Prüfung vor Template-Transformation
{"model": "google/gemini-2.5-flash", "model_limit": 8192, "text_length": 21168, "estimated_text_tokens": 11520, "estimated_system_tokens": 3080, "estimated_template_tokens": 1226, "total_estimated_tokens": 15826}
2026-03-24 16:36:49,637 INFO transcription_utils.py:_extract_system_prompt:1753 TransformerProcessor 21dc1862-7e06-474b-88cb-766001da1f12 Systemprompt aus Template extrahiert
{"prompt_length": 8462}
2026-03-24 16:36:49,637 INFO transcription_utils.py:transform_by_template:1042 TransformerProcessor 21dc1862-7e06-474b-88cb-766001da1f12 Verwende direkt übergebenes Template-Inhalt (Länge: 12015)
2026-03-24 16:36:49,636 INFO transcription_utils.py:transform_by_template:1027 TransformerProcessor 21dc1862-7e06-474b-88cb-766001da1f12 Starte Template-Transformation:
2026-03-24 16:36:49,636 DEBUG transformer_processor.py:__init__:199 TransformerProcessor 21dc1862-7e06-474b-88cb-766001da1f12 Transformer Init Timings
{"total_init_ms": 251.78, "super_init_ms": 153.92, "config_load_ms": 16.5, "client_init_ms": 0.01, "transcriber_init_ms": 81.04}
2026-03-24 16:36:49,636 DEBUG transformer_processor.py:__init__:179 TransformerProcessor 21dc1862-7e06-474b-88cb-766001da1f12 Transformer Processor initialisiert
{"model": "google/gemini-2.5-flash", "temperature": 0.1, "max_tokens": 4000, "target_format": "text", "max_concurrent_requests": 10, "timeout_seconds": 120, "templates_dir": "resources/templates"}
2026-03-24 16:36:49,385 DEBUG base_processor.py:init_logger:386 TransformerProcessor 21dc1862-7e06-474b-88cb-766001da1f12 Logger initialisiert
2026-03-24 16:36:43,782 INFO transcription_utils.py:transform_by_template:1461 TransformerProcessor 4b1a28e8-437e-42e9-8e8e-4dd0b57c2113 Template-Transformation abgeschlossen
{"duration_ms": 8094.255447387695, "model": "google/gemini-2.5-flash"}
2026-03-24 16:36:43,781 DEBUG transcription_utils.py:create_llm_request:455 TransformerProcessor 4b1a28e8-437e-42e9-8e8e-4dd0b57c2113 LLM Interaktion gespeichert
{"file": "cache/transformer/temp/debug/llm/20260324_163643_template_transform.json", "tokens": 10440}
2026-03-24 16:36:43,778 DEBUG transcription_utils.py:create_llm_request:416 TransformerProcessor 4b1a28e8-437e-42e9-8e8e-4dd0b57c2113 LLM Request erstellt
{"purpose": "template_transform", "tokens": 10440, "duration": 8094.255447387695, "model": "google/gemini-2.5-flash", "processor": "WhisperTranscriber"}
2026-03-24 16:36:43,778 DEBUG base_processor.py:add_llm_requests:510 TransformerProcessor 4b1a28e8-437e-42e9-8e8e-4dd0b57c2113 1 LLM-Requests hinzugefügt
2026-03-24 16:36:43,132 INFO transcription_utils.py:transform_by_template:1187 TransformerProcessor 2b3972df-39f5-43d6-9c77-c64ee744f864 Sende Anfrage an LLM Provider
2026-03-24 16:36:43,131 INFO transcription_utils.py:transform_by_template:1171 TransformerProcessor 2b3972df-39f5-43d6-9c77-c64ee744f864 Token-Prüfung vor Template-Transformation
{"model": "google/gemini-2.5-flash", "model_limit": 8192, "text_length": 18387, "estimated_text_tokens": 12392, "estimated_system_tokens": 3080, "estimated_template_tokens": 1232, "total_estimated_tokens": 16704}
2026-03-24 16:36:43,108 INFO transcription_utils.py:_extract_system_prompt:1753 TransformerProcessor 2b3972df-39f5-43d6-9c77-c64ee744f864 Systemprompt aus Template extrahiert
{"prompt_length": 8462}
2026-03-24 16:36:43,107 INFO transcription_utils.py:transform_by_template:1042 TransformerProcessor 2b3972df-39f5-43d6-9c77-c64ee744f864 Verwende direkt übergebenes Template-Inhalt (Länge: 12015)
2026-03-24 16:36:43,107 INFO transcription_utils.py:transform_by_template:1027 TransformerProcessor 2b3972df-39f5-43d6-9c77-c64ee744f864 Starte Template-Transformation:
2026-03-24 16:36:43,107 DEBUG transformer_processor.py:__init__:199 TransformerProcessor 2b3972df-39f5-43d6-9c77-c64ee744f864 Transformer Init Timings
{"total_init_ms": 209.06, "super_init_ms": 105.94, "config_load_ms": 17.42, "client_init_ms": 0.01, "transcriber_init_ms": 85.31}
2026-03-24 16:36:43,107 DEBUG transformer_processor.py:__init__:179 TransformerProcessor 2b3972df-39f5-43d6-9c77-c64ee744f864 Transformer Processor initialisiert
{"model": "google/gemini-2.5-flash", "temperature": 0.1, "max_tokens": 4000, "target_format": "text", "max_concurrent_requests": 10, "timeout_seconds": 120, "templates_dir": "resources/templates"}
2026-03-24 16:36:42,899 DEBUG base_processor.py:init_logger:386 TransformerProcessor 2b3972df-39f5-43d6-9c77-c64ee744f864 Logger initialisiert
2026-03-24 16:36:35,683 INFO transcription_utils.py:transform_by_template:1187 TransformerProcessor 4b1a28e8-437e-42e9-8e8e-4dd0b57c2113 Sende Anfrage an LLM Provider
2026-03-24 16:36:35,683 INFO transcription_utils.py:transform_by_template:1171 TransformerProcessor 4b1a28e8-437e-42e9-8e8e-4dd0b57c2113 Token-Prüfung vor Template-Transformation
{"model": "google/gemini-2.5-flash", "model_limit": 8192, "text_length": 11151, "estimated_text_tokens": 8616, "estimated_system_tokens": 3080, "estimated_template_tokens": 1238, "total_estimated_tokens": 12934}
2026-03-24 16:36:35,650 INFO transcription_utils.py:_extract_system_prompt:1753 TransformerProcessor 4b1a28e8-437e-42e9-8e8e-4dd0b57c2113 Systemprompt aus Template extrahiert
{"prompt_length": 8462}
2026-03-24 16:36:35,650 INFO transcription_utils.py:transform_by_template:1042 TransformerProcessor 4b1a28e8-437e-42e9-8e8e-4dd0b57c2113 Verwende direkt übergebenes Template-Inhalt (Länge: 12015)
2026-03-24 16:36:35,649 INFO transcription_utils.py:transform_by_template:1027 TransformerProcessor 4b1a28e8-437e-42e9-8e8e-4dd0b57c2113 Starte Template-Transformation:
2026-03-24 16:36:35,649 DEBUG transformer_processor.py:__init__:199 TransformerProcessor 4b1a28e8-437e-42e9-8e8e-4dd0b57c2113 Transformer Init Timings
{"total_init_ms": 291.65, "super_init_ms": 170.18, "config_load_ms": 17.01, "client_init_ms": 0.01, "transcriber_init_ms": 103.99}
2026-03-24 16:36:35,649 DEBUG transformer_processor.py:__init__:179 TransformerProcessor 4b1a28e8-437e-42e9-8e8e-4dd0b57c2113 Transformer Processor initialisiert
{"model": "google/gemini-2.5-flash", "temperature": 0.1, "max_tokens": 4000, "target_format": "text", "max_concurrent_requests": 10, "timeout_seconds": 120, "templates_dir": "resources/templates"}
2026-03-24 16:36:35,358 DEBUG base_processor.py:init_logger:386 TransformerProcessor 4b1a28e8-437e-42e9-8e8e-4dd0b57c2113 Logger initialisiert
2026-03-24 16:36:32,885 INFO transcription_utils.py:transform_by_template:1461 TransformerProcessor 0510f7c1-c38b-4a48-b649-5c85cdffe76e Template-Transformation abgeschlossen
{"duration_ms": 3602.949380874634, "model": "google/gemini-2.5-flash"}
2026-03-24 16:36:32,884 DEBUG transcription_utils.py:create_llm_request:455 TransformerProcessor 0510f7c1-c38b-4a48-b649-5c85cdffe76e LLM Interaktion gespeichert
{"file": "cache/transformer/temp/debug/llm/20260324_163632_template_transform.json", "tokens": 9472}
2026-03-24 16:36:32,883 DEBUG transcription_utils.py:create_llm_request:416 TransformerProcessor 0510f7c1-c38b-4a48-b649-5c85cdffe76e LLM Request erstellt
{"purpose": "template_transform", "tokens": 9472, "duration": 3602.949380874634, "model": "google/gemini-2.5-flash", "processor": "WhisperTranscriber"}
2026-03-24 16:36:32,883 DEBUG base_processor.py:add_llm_requests:510 TransformerProcessor 0510f7c1-c38b-4a48-b649-5c85cdffe76e 1 LLM-Requests hinzugefügt
2026-03-24 16:36:30,618 INFO transcription_utils.py:transform_by_template:1461 TransformerProcessor f9060d88-bbae-48e4-9921-75de78fe31de Template-Transformation abgeschlossen
{"duration_ms": 6733.164072036743, "model": "google/gemini-2.5-flash"}
2026-03-24 16:36:30,616 DEBUG transcription_utils.py:create_llm_request:455 TransformerProcessor f9060d88-bbae-48e4-9921-75de78fe31de LLM Interaktion gespeichert
{"file": "cache/transformer/temp/debug/llm/20260324_163630_template_transform.json", "tokens": 10483}
2026-03-24 16:36:30,614 DEBUG transcription_utils.py:create_llm_request:416 TransformerProcessor f9060d88-bbae-48e4-9921-75de78fe31de LLM Request erstellt
{"purpose": "template_transform", "tokens": 10483, "duration": 6733.164072036743, "model": "google/gemini-2.5-flash", "processor": "WhisperTranscriber"}
2026-03-24 16:36:30,613 DEBUG base_processor.py:add_llm_requests:510 TransformerProcessor f9060d88-bbae-48e4-9921-75de78fe31de 1 LLM-Requests hinzugefügt
2026-03-24 16:36:29,279 INFO transcription_utils.py:transform_by_template:1187 TransformerProcessor 0510f7c1-c38b-4a48-b649-5c85cdffe76e Sende Anfrage an LLM Provider
2026-03-24 16:36:29,279 INFO transcription_utils.py:transform_by_template:1171 TransformerProcessor 0510f7c1-c38b-4a48-b649-5c85cdffe76e Token-Prüfung vor Template-Transformation
{"model": "google/gemini-2.5-flash", "model_limit": 8192, "text_length": 9263, "estimated_text_tokens": 7248, "estimated_system_tokens": 3080, "estimated_template_tokens": 1250, "total_estimated_tokens": 11578}
2026-03-24 16:36:29,257 INFO transcription_utils.py:_extract_system_prompt:1753 TransformerProcessor 0510f7c1-c38b-4a48-b649-5c85cdffe76e Systemprompt aus Template extrahiert
{"prompt_length": 8462}
2026-03-24 16:36:29,257 INFO transcription_utils.py:transform_by_template:1042 TransformerProcessor 0510f7c1-c38b-4a48-b649-5c85cdffe76e Verwende direkt übergebenes Template-Inhalt (Länge: 12015)
2026-03-24 16:36:29,257 INFO transcription_utils.py:transform_by_template:1027 TransformerProcessor 0510f7c1-c38b-4a48-b649-5c85cdffe76e Starte Template-Transformation:
2026-03-24 16:36:29,257 DEBUG transformer_processor.py:__init__:199 TransformerProcessor 0510f7c1-c38b-4a48-b649-5c85cdffe76e Transformer Init Timings
{"total_init_ms": 183.63, "super_init_ms": 98.93, "config_load_ms": 17.0, "client_init_ms": 0.01, "transcriber_init_ms": 67.37}
2026-03-24 16:36:29,256 DEBUG transformer_processor.py:__init__:179 TransformerProcessor 0510f7c1-c38b-4a48-b649-5c85cdffe76e Transformer Processor initialisiert
{"model": "google/gemini-2.5-flash", "temperature": 0.1, "max_tokens": 4000, "target_format": "text", "max_concurrent_requests": 10, "timeout_seconds": 120, "templates_dir": "resources/templates"}
2026-03-24 16:36:29,074 DEBUG base_processor.py:init_logger:386 TransformerProcessor 0510f7c1-c38b-4a48-b649-5c85cdffe76e Logger initialisiert
2026-03-24 16:36:23,879 INFO transcription_utils.py:transform_by_template:1187 TransformerProcessor f9060d88-bbae-48e4-9921-75de78fe31de Sende Anfrage an LLM Provider
2026-03-24 16:36:23,879 INFO transcription_utils.py:transform_by_template:1171 TransformerProcessor f9060d88-bbae-48e4-9921-75de78fe31de Token-Prüfung vor Template-Transformation
{"model": "google/gemini-2.5-flash", "model_limit": 8192, "text_length": 11235, "estimated_text_tokens": 8724, "estimated_system_tokens": 3080, "estimated_template_tokens": 1238, "total_estimated_tokens": 13042}
2026-03-24 16:36:23,853 INFO transcription_utils.py:_extract_system_prompt:1753 TransformerProcessor f9060d88-bbae-48e4-9921-75de78fe31de Systemprompt aus Template extrahiert
{"prompt_length": 8462}
2026-03-24 16:36:23,853 INFO transcription_utils.py:transform_by_template:1042 TransformerProcessor f9060d88-bbae-48e4-9921-75de78fe31de Verwende direkt übergebenes Template-Inhalt (Länge: 12015)
2026-03-24 16:36:23,853 INFO transcription_utils.py:transform_by_template:1027 TransformerProcessor f9060d88-bbae-48e4-9921-75de78fe31de Starte Template-Transformation:
2026-03-24 16:36:23,853 DEBUG transformer_processor.py:__init__:199 TransformerProcessor f9060d88-bbae-48e4-9921-75de78fe31de Transformer Init Timings
{"total_init_ms": 223.06, "super_init_ms": 106.16, "config_load_ms": 19.0, "client_init_ms": 0.01, "transcriber_init_ms": 97.55}
2026-03-24 16:36:23,852 DEBUG transformer_processor.py:__init__:179 TransformerProcessor f9060d88-bbae-48e4-9921-75de78fe31de Transformer Processor initialisiert
{"model": "google/gemini-2.5-flash", "temperature": 0.1, "max_tokens": 4000, "target_format": "text", "max_concurrent_requests": 10, "timeout_seconds": 120, "templates_dir": "resources/templates"}
2026-03-24 16:36:23,630 DEBUG base_processor.py:init_logger:386 TransformerProcessor f9060d88-bbae-48e4-9921-75de78fe31de Logger initialisiert
2026-03-24 16:36:22,053 INFO transcription_utils.py:transform_by_template:1461 TransformerProcessor e3f655d6-32c5-40a8-bf42-54588e627d32 Template-Transformation abgeschlossen
{"duration_ms": 8406.952619552612, "model": "google/gemini-2.5-flash"}
2026-03-24 16:36:22,051 DEBUG transcription_utils.py:create_llm_request:455 TransformerProcessor e3f655d6-32c5-40a8-bf42-54588e627d32 LLM Interaktion gespeichert
{"file": "cache/transformer/temp/debug/llm/20260324_163622_template_transform.json", "tokens": 6948}
2026-03-24 16:36:22,051 DEBUG transcription_utils.py:create_llm_request:416 TransformerProcessor e3f655d6-32c5-40a8-bf42-54588e627d32 LLM Request erstellt
{"purpose": "template_transform", "tokens": 6948, "duration": 8406.952619552612, "model": "google/gemini-2.5-flash", "processor": "WhisperTranscriber"}
2026-03-24 16:36:22,050 DEBUG base_processor.py:add_llm_requests:510 TransformerProcessor e3f655d6-32c5-40a8-bf42-54588e627d32 1 LLM-Requests hinzugefügt
2026-03-24 16:36:13,643 INFO transcription_utils.py:transform_by_template:1187 TransformerProcessor e3f655d6-32c5-40a8-bf42-54588e627d32 Sende Anfrage an LLM Provider
2026-03-24 16:36:13,643 INFO transcription_utils.py:transform_by_template:1171 TransformerProcessor e3f655d6-32c5-40a8-bf42-54588e627d32 Token-Prüfung vor Template-Transformation
{"model": "google/gemini-2.5-flash", "model_limit": 8192, "text_length": 4300, "estimated_text_tokens": 3438, "estimated_system_tokens": 3080, "estimated_template_tokens": 1238, "total_estimated_tokens": 7756}
2026-03-24 16:36:13,607 INFO transcription_utils.py:_extract_system_prompt:1753 TransformerProcessor e3f655d6-32c5-40a8-bf42-54588e627d32 Systemprompt aus Template extrahiert
{"prompt_length": 8462}
2026-03-24 16:36:13,607 INFO transcription_utils.py:transform_by_template:1042 TransformerProcessor e3f655d6-32c5-40a8-bf42-54588e627d32 Verwende direkt übergebenes Template-Inhalt (Länge: 12015)
2026-03-24 16:36:13,607 INFO transcription_utils.py:transform_by_template:1027 TransformerProcessor e3f655d6-32c5-40a8-bf42-54588e627d32 Starte Template-Transformation:
2026-03-24 16:36:13,607 DEBUG transformer_processor.py:__init__:199 TransformerProcessor e3f655d6-32c5-40a8-bf42-54588e627d32 Transformer Init Timings
{"total_init_ms": 267.14, "super_init_ms": 103.12, "config_load_ms": 34.4, "client_init_ms": 0.02, "transcriber_init_ms": 129.0}
2026-03-24 16:36:13,606 DEBUG transformer_processor.py:__init__:179 TransformerProcessor e3f655d6-32c5-40a8-bf42-54588e627d32 Transformer Processor initialisiert
{"model": "google/gemini-2.5-flash", "temperature": 0.1, "max_tokens": 4000, "target_format": "text", "max_concurrent_requests": 10, "timeout_seconds": 120, "templates_dir": "resources/templates"}
2026-03-24 16:36:13,340 DEBUG base_processor.py:init_logger:386 TransformerProcessor e3f655d6-32c5-40a8-bf42-54588e627d32 Logger initialisiert
2026-03-24 16:36:13,205 INFO transcription_utils.py:transform_by_template:1461 TransformerProcessor da6d965b-ef9b-41af-aa04-324d7780510b Template-Transformation abgeschlossen
{"duration_ms": 4163.496255874634, "model": "google/gemini-2.5-flash"}
2026-03-24 16:36:13,204 DEBUG transcription_utils.py:create_llm_request:455 TransformerProcessor da6d965b-ef9b-41af-aa04-324d7780510b LLM Interaktion gespeichert
{"file": "cache/transformer/temp/debug/llm/20260324_163613_template_transform.json", "tokens": 7173}
2026-03-24 16:36:13,203 DEBUG transcription_utils.py:create_llm_request:416 TransformerProcessor da6d965b-ef9b-41af-aa04-324d7780510b LLM Request erstellt
{"purpose": "template_transform", "tokens": 7173, "duration": 4163.496255874634, "model": "google/gemini-2.5-flash", "processor": "WhisperTranscriber"}
2026-03-24 16:36:13,203 DEBUG base_processor.py:add_llm_requests:510 TransformerProcessor da6d965b-ef9b-41af-aa04-324d7780510b 1 LLM-Requests hinzugefügt
2026-03-24 16:36:09,039 INFO transcription_utils.py:transform_by_template:1187 TransformerProcessor da6d965b-ef9b-41af-aa04-324d7780510b Sende Anfrage an LLM Provider
2026-03-24 16:36:09,039 INFO transcription_utils.py:transform_by_template:1171 TransformerProcessor da6d965b-ef9b-41af-aa04-324d7780510b Token-Prüfung vor Template-Transformation
{"model": "google/gemini-2.5-flash", "model_limit": 8192, "text_length": 4818, "estimated_text_tokens": 3884, "estimated_system_tokens": 3080, "estimated_template_tokens": 1232, "total_estimated_tokens": 8196}
2026-03-24 16:36:09,015 INFO transcription_utils.py:_extract_system_prompt:1753 TransformerProcessor da6d965b-ef9b-41af-aa04-324d7780510b Systemprompt aus Template extrahiert
{"prompt_length": 8462}
2026-03-24 16:36:09,015 INFO transcription_utils.py:transform_by_template:1042 TransformerProcessor da6d965b-ef9b-41af-aa04-324d7780510b Verwende direkt übergebenes Template-Inhalt (Länge: 12015)
2026-03-24 16:36:09,014 INFO transcription_utils.py:transform_by_template:1027 TransformerProcessor da6d965b-ef9b-41af-aa04-324d7780510b Starte Template-Transformation:
2026-03-24 16:36:09,014 DEBUG transformer_processor.py:__init__:199 TransformerProcessor da6d965b-ef9b-41af-aa04-324d7780510b Transformer Init Timings
{"total_init_ms": 186.62, "super_init_ms": 86.9, "config_load_ms": 17.21, "client_init_ms": 0.01, "transcriber_init_ms": 82.18}
2026-03-24 16:36:09,014 DEBUG transformer_processor.py:__init__:179 TransformerProcessor da6d965b-ef9b-41af-aa04-324d7780510b Transformer Processor initialisiert
{"model": "google/gemini-2.5-flash", "temperature": 0.1, "max_tokens": 4000, "target_format": "text", "max_concurrent_requests": 10, "timeout_seconds": 120, "templates_dir": "resources/templates"}
2026-03-24 16:36:08,828 DEBUG base_processor.py:init_logger:386 TransformerProcessor da6d965b-ef9b-41af-aa04-324d7780510b Logger initialisiert
2026-03-24 16:36:08,402 INFO transcription_utils.py:transform_by_template:1461 TransformerProcessor 2c71210c-2152-4e27-b6dd-314794e4eb36 Template-Transformation abgeschlossen
{"duration_ms": 8358.691930770874, "model": "google/gemini-2.5-flash"}
2026-03-24 16:36:08,401 DEBUG transcription_utils.py:create_llm_request:455 TransformerProcessor 2c71210c-2152-4e27-b6dd-314794e4eb36 LLM Interaktion gespeichert
{"file": "cache/transformer/temp/debug/llm/20260324_163608_template_transform.json", "tokens": 20473}
2026-03-24 16:36:08,399 DEBUG transcription_utils.py:create_llm_request:416 TransformerProcessor 2c71210c-2152-4e27-b6dd-314794e4eb36 LLM Request erstellt
{"purpose": "template_transform", "tokens": 20473, "duration": 8358.691930770874, "model": "google/gemini-2.5-flash", "processor": "WhisperTranscriber"}
2026-03-24 16:36:08,399 DEBUG base_processor.py:add_llm_requests:510 TransformerProcessor 2c71210c-2152-4e27-b6dd-314794e4eb36 1 LLM-Requests hinzugefügt
2026-03-24 16:36:00,040 INFO transcription_utils.py:transform_by_template:1187 TransformerProcessor 2c71210c-2152-4e27-b6dd-314794e4eb36 Sende Anfrage an LLM Provider
2026-03-24 16:36:00,040 INFO transcription_utils.py:transform_by_template:1171 TransformerProcessor 2c71210c-2152-4e27-b6dd-314794e4eb36 Token-Prüfung vor Template-Transformation
{"model": "google/gemini-2.5-flash", "model_limit": 8192, "text_length": 28423, "estimated_text_tokens": 23256, "estimated_system_tokens": 3080, "estimated_template_tokens": 1238, "total_estimated_tokens": 27574}
2026-03-24 16:36:00,007 INFO transcription_utils.py:_extract_system_prompt:1753 TransformerProcessor 2c71210c-2152-4e27-b6dd-314794e4eb36 Systemprompt aus Template extrahiert
{"prompt_length": 8462}
2026-03-24 16:36:00,007 INFO transcription_utils.py:transform_by_template:1042 TransformerProcessor 2c71210c-2152-4e27-b6dd-314794e4eb36 Verwende direkt übergebenes Template-Inhalt (Länge: 12015)
2026-03-24 16:36:00,006 INFO transcription_utils.py:transform_by_template:1027 TransformerProcessor 2c71210c-2152-4e27-b6dd-314794e4eb36 Starte Template-Transformation:
2026-03-24 16:36:00,006 DEBUG transformer_processor.py:__init__:199 TransformerProcessor 2c71210c-2152-4e27-b6dd-314794e4eb36 Transformer Init Timings
{"total_init_ms": 978.04, "super_init_ms": 506.21, "config_load_ms": 170.07, "client_init_ms": 64.91, "transcriber_init_ms": 149.21}
2026-03-24 16:36:00,006 DEBUG transformer_processor.py:__init__:179 TransformerProcessor 2c71210c-2152-4e27-b6dd-314794e4eb36 Transformer Processor initialisiert
{"model": "google/gemini-2.5-flash", "temperature": 0.1, "max_tokens": 4000, "target_format": "text", "max_concurrent_requests": 10, "timeout_seconds": 120, "templates_dir": "resources/templates"}
2026-03-24 16:35:59,955 ERROR transformer_processor.py:transformByTemplate:2166 TransformerProcessor c855a741-c03d-411d-9752-e3ff17e03b06 Fehler bei der Template-Transformation: LLM-Anfrage fehlgeschlagen. Provider: konfigurierter LLM-Provider, Modell: google/gemini-2.5-flash. Fehler: Chat-Completion Provider nicht verfügbar
2026-03-24 16:35:59,955 ERROR transcription_utils.py:transform_by_template:1479 TransformerProcessor c855a741-c03d-411d-9752-e3ff17e03b06 Fehler bei der Template-Transformation
2026-03-24 16:35:59,954 ERROR transcription_utils.py:transform_by_template:1244 TransformerProcessor c855a741-c03d-411d-9752-e3ff17e03b06 LLM-Provider-Anfrage fehlgeschlagen
2026-03-24 16:35:59,954 INFO transcription_utils.py:transform_by_template:1187 TransformerProcessor c855a741-c03d-411d-9752-e3ff17e03b06 Sende Anfrage an LLM Provider
2026-03-24 16:35:59,954 INFO transcription_utils.py:transform_by_template:1171 TransformerProcessor c855a741-c03d-411d-9752-e3ff17e03b06 Token-Prüfung vor Template-Transformation
{"model": "google/gemini-2.5-flash", "model_limit": 8192, "text_length": 7653, "estimated_text_tokens": 5960, "estimated_system_tokens": 3080, "estimated_template_tokens": 1250, "total_estimated_tokens": 10290}
2026-03-24 16:35:59,952 INFO transcription_utils.py:_extract_system_prompt:1753 TransformerProcessor c855a741-c03d-411d-9752-e3ff17e03b06 Systemprompt aus Template extrahiert
{"prompt_length": 8462}
2026-03-24 16:35:59,952 INFO transcription_utils.py:transform_by_template:1042 TransformerProcessor c855a741-c03d-411d-9752-e3ff17e03b06 Verwende direkt übergebenes Template-Inhalt (Länge: 12015)
2026-03-24 16:35:59,952 INFO transcription_utils.py:transform_by_template:1027 TransformerProcessor c855a741-c03d-411d-9752-e3ff17e03b06 Starte Template-Transformation:
2026-03-24 16:35:59,951 DEBUG transformer_processor.py:__init__:199 TransformerProcessor c855a741-c03d-411d-9752-e3ff17e03b06 Transformer Init Timings
{"total_init_ms": 831.24, "super_init_ms": 580.25, "config_load_ms": 71.49, "client_init_ms": 65.84, "transcriber_init_ms": 113.17}
2026-03-24 16:35:59,951 DEBUG transformer_processor.py:__init__:179 TransformerProcessor c855a741-c03d-411d-9752-e3ff17e03b06 Transformer Processor initialisiert
{"model": "google/gemini-2.5-flash", "temperature": 0.1, "max_tokens": 4000, "target_format": "text", "max_concurrent_requests": 10, "timeout_seconds": 120, "templates_dir": "resources/templates"}
2026-03-24 16:35:59,857 ERROR transformer_processor.py:transformByTemplate:2166 TransformerProcessor dfd262d8-c599-4818-8e53-97ea9b58616b Fehler bei der Template-Transformation: LLM-Anfrage fehlgeschlagen. Provider: konfigurierter LLM-Provider, Modell: gpt-4o-transcribe. Fehler: Chat-Completion Provider nicht verfügbar
2026-03-24 16:35:59,856 ERROR transcription_utils.py:transform_by_template:1479 TransformerProcessor dfd262d8-c599-4818-8e53-97ea9b58616b Fehler bei der Template-Transformation
2026-03-24 16:35:59,856 ERROR transcription_utils.py:transform_by_template:1244 TransformerProcessor dfd262d8-c599-4818-8e53-97ea9b58616b LLM-Provider-Anfrage fehlgeschlagen
2026-03-24 16:35:59,856 INFO transcription_utils.py:transform_by_template:1187 TransformerProcessor dfd262d8-c599-4818-8e53-97ea9b58616b Sende Anfrage an LLM Provider
2026-03-24 16:35:59,856 INFO transcription_utils.py:transform_by_template:1171 TransformerProcessor dfd262d8-c599-4818-8e53-97ea9b58616b Token-Prüfung vor Template-Transformation
{"model": "gpt-4o-transcribe", "model_limit": 128000, "text_length": 17652, "estimated_text_tokens": 14874, "estimated_system_tokens": 3080, "estimated_template_tokens": 1232, "total_estimated_tokens": 19186}
2026-03-24 16:35:59,802 INFO transcription_utils.py:_extract_system_prompt:1753 TransformerProcessor dfd262d8-c599-4818-8e53-97ea9b58616b Systemprompt aus Template extrahiert
{"prompt_length": 8462}
2026-03-24 16:35:59,801 INFO transcription_utils.py:transform_by_template:1042 TransformerProcessor dfd262d8-c599-4818-8e53-97ea9b58616b Verwende direkt übergebenes Template-Inhalt (Länge: 12015)
2026-03-24 16:35:59,801 INFO transcription_utils.py:transform_by_template:1027 TransformerProcessor dfd262d8-c599-4818-8e53-97ea9b58616b Starte Template-Transformation:
2026-03-24 16:35:59,800 DEBUG transformer_processor.py:__init__:199 TransformerProcessor dfd262d8-c599-4818-8e53-97ea9b58616b Transformer Init Timings
{"total_init_ms": 824.88, "super_init_ms": 476.7, "config_load_ms": 97.99, "client_init_ms": 0.02, "transcriber_init_ms": 249.65}
2026-03-24 16:35:59,800 DEBUG transformer_processor.py:__init__:179 TransformerProcessor dfd262d8-c599-4818-8e53-97ea9b58616b Transformer Processor initialisiert
{"model": "google/gemini-2.5-flash", "temperature": 0.1, "max_tokens": 4000, "target_format": "text", "max_concurrent_requests": 10, "timeout_seconds": 120, "templates_dir": "resources/templates"}
2026-03-24 16:35:59,121 DEBUG base_processor.py:init_logger:386 TransformerProcessor c855a741-c03d-411d-9752-e3ff17e03b06 Logger initialisiert
2026-03-24 16:35:59,029 DEBUG base_processor.py:init_logger:386 TransformerProcessor 2c71210c-2152-4e27-b6dd-314794e4eb36 Logger initialisiert
2026-03-24 16:35:58,976 DEBUG base_processor.py:init_logger:386 TransformerProcessor dfd262d8-c599-4818-8e53-97ea9b58616b Logger initialisiert
2026-03-24 16:35:47,810 INFO transcription_utils.py:transform_by_template:1461 TransformerProcessor 51af717d-7767-4a2b-9684-a232834a565f Template-Transformation abgeschlossen
{"duration_ms": 4485.366344451904, "model": "google/gemini-2.5-flash"}
2026-03-24 16:35:47,809 DEBUG transcription_utils.py:create_llm_request:455 TransformerProcessor 51af717d-7767-4a2b-9684-a232834a565f LLM Interaktion gespeichert
{"file": "cache/transformer/temp/debug/llm/20260324_163547_template_transform.json", "tokens": 13840}
2026-03-24 16:35:47,809 DEBUG transcription_utils.py:create_llm_request:416 TransformerProcessor 51af717d-7767-4a2b-9684-a232834a565f LLM Request erstellt
{"purpose": "template_transform", "tokens": 13840, "duration": 4485.366344451904, "model": "google/gemini-2.5-flash", "processor": "WhisperTranscriber"}
2026-03-24 16:35:47,808 DEBUG base_processor.py:add_llm_requests:510 TransformerProcessor 51af717d-7767-4a2b-9684-a232834a565f 1 LLM-Requests hinzugefügt
2026-03-24 16:35:45,878 INFO transcription_utils.py:transform_by_template:1461 TransformerProcessor e502f3e8-64aa-4bf2-908f-5d21355e48ca Template-Transformation abgeschlossen
{"duration_ms": 9734.662294387817, "model": "google/gemini-2.5-flash"}
2026-03-24 16:35:45,877 DEBUG transcription_utils.py:create_llm_request:455 TransformerProcessor e502f3e8-64aa-4bf2-908f-5d21355e48ca LLM Interaktion gespeichert
{"file": "cache/transformer/temp/debug/llm/20260324_163545_template_transform.json", "tokens": 164711}
2026-03-24 16:35:45,871 DEBUG transcription_utils.py:create_llm_request:416 TransformerProcessor e502f3e8-64aa-4bf2-908f-5d21355e48ca LLM Request erstellt
{"purpose": "template_transform", "tokens": 164711, "duration": 9734.662294387817, "model": "google/gemini-2.5-flash", "processor": "WhisperTranscriber"}
2026-03-24 16:35:45,870 DEBUG base_processor.py:add_llm_requests:510 TransformerProcessor e502f3e8-64aa-4bf2-908f-5d21355e48ca 1 LLM-Requests hinzugefügt
2026-03-24 16:35:43,322 INFO transcription_utils.py:transform_by_template:1187 TransformerProcessor 51af717d-7767-4a2b-9684-a232834a565f Sende Anfrage an LLM Provider
2026-03-24 16:35:43,322 INFO transcription_utils.py:transform_by_template:1171 TransformerProcessor 51af717d-7767-4a2b-9684-a232834a565f Token-Prüfung vor Template-Transformation
{"model": "google/gemini-2.5-flash", "model_limit": 8192, "text_length": 16349, "estimated_text_tokens": 12134, "estimated_system_tokens": 3080, "estimated_template_tokens": 1232, "total_estimated_tokens": 16446}
2026-03-24 16:35:43,290 INFO transcription_utils.py:_extract_system_prompt:1753 TransformerProcessor 51af717d-7767-4a2b-9684-a232834a565f Systemprompt aus Template extrahiert
{"prompt_length": 8462}
2026-03-24 16:35:43,290 INFO transcription_utils.py:transform_by_template:1042 TransformerProcessor 51af717d-7767-4a2b-9684-a232834a565f Verwende direkt übergebenes Template-Inhalt (Länge: 12015)
2026-03-24 16:35:43,290 INFO transcription_utils.py:transform_by_template:1027 TransformerProcessor 51af717d-7767-4a2b-9684-a232834a565f Starte Template-Transformation:
2026-03-24 16:35:43,290 DEBUG transformer_processor.py:__init__:199 TransformerProcessor 51af717d-7767-4a2b-9684-a232834a565f Transformer Init Timings
{"total_init_ms": 212.75, "super_init_ms": 114.03, "config_load_ms": 16.91, "client_init_ms": 0.01, "transcriber_init_ms": 81.44}
2026-03-24 16:35:43,289 DEBUG transformer_processor.py:__init__:179 TransformerProcessor 51af717d-7767-4a2b-9684-a232834a565f Transformer Processor initialisiert
{"model": "google/gemini-2.5-flash", "temperature": 0.1, "max_tokens": 4000, "target_format": "text", "max_concurrent_requests": 10, "timeout_seconds": 120, "templates_dir": "resources/templates"}
2026-03-24 16:35:43,078 DEBUG base_processor.py:init_logger:386 TransformerProcessor 51af717d-7767-4a2b-9684-a232834a565f Logger initialisiert
2026-03-24 16:35:43,073 INFO transcription_utils.py:transform_by_template:1461 TransformerProcessor 29d00438-6502-47c1-a125-29ff0acda34b Template-Transformation abgeschlossen
{"duration_ms": 57155.40170669556, "model": "google/gemini-2.5-flash"}
2026-03-24 16:35:43,071 DEBUG transcription_utils.py:create_llm_request:455 TransformerProcessor 29d00438-6502-47c1-a125-29ff0acda34b LLM Interaktion gespeichert
{"file": "cache/transformer/temp/debug/llm/20260324_163543_template_transform.json", "tokens": 351363}
2026-03-24 16:35:43,057 DEBUG transcription_utils.py:create_llm_request:416 TransformerProcessor 29d00438-6502-47c1-a125-29ff0acda34b LLM Request erstellt
{"purpose": "template_transform", "tokens": 351363, "duration": 57155.40170669556, "model": "google/gemini-2.5-flash", "processor": "WhisperTranscriber"}
2026-03-24 16:35:43,057 DEBUG base_processor.py:add_llm_requests:510 TransformerProcessor 29d00438-6502-47c1-a125-29ff0acda34b 1 LLM-Requests hinzugefügt
2026-03-24 16:35:36,135 INFO transcription_utils.py:transform_by_template:1187 TransformerProcessor e502f3e8-64aa-4bf2-908f-5d21355e48ca Sende Anfrage an LLM Provider
2026-03-24 16:35:36,135 INFO transcription_utils.py:transform_by_template:1171 TransformerProcessor e502f3e8-64aa-4bf2-908f-5d21355e48ca Token-Prüfung vor Template-Transformation
{"model": "google/gemini-2.5-flash", "model_limit": 8192, "text_length": 297611, "estimated_text_tokens": 197382, "estimated_system_tokens": 3080, "estimated_template_tokens": 1232, "total_estimated_tokens": 201694}
2026-03-24 16:35:36,109 INFO transcription_utils.py:_extract_system_prompt:1753 TransformerProcessor e502f3e8-64aa-4bf2-908f-5d21355e48ca Systemprompt aus Template extrahiert
{"prompt_length": 8462}
2026-03-24 16:35:36,109 INFO transcription_utils.py:transform_by_template:1042 TransformerProcessor e502f3e8-64aa-4bf2-908f-5d21355e48ca Verwende direkt übergebenes Template-Inhalt (Länge: 12015)
2026-03-24 16:35:36,109 INFO transcription_utils.py:transform_by_template:1027 TransformerProcessor e502f3e8-64aa-4bf2-908f-5d21355e48ca Starte Template-Transformation:
2026-03-24 16:35:36,109 DEBUG transformer_processor.py:__init__:199 TransformerProcessor e502f3e8-64aa-4bf2-908f-5d21355e48ca Transformer Init Timings
{"total_init_ms": 243.12, "super_init_ms": 126.61, "config_load_ms": 16.81, "client_init_ms": 0.01, "transcriber_init_ms": 99.36}
2026-03-24 16:35:36,108 DEBUG transformer_processor.py:__init__:179 TransformerProcessor e502f3e8-64aa-4bf2-908f-5d21355e48ca Transformer Processor initialisiert
{"model": "google/gemini-2.5-flash", "temperature": 0.1, "max_tokens": 4000, "target_format": "text", "max_concurrent_requests": 10, "timeout_seconds": 120, "templates_dir": "resources/templates"}
2026-03-24 16:35:35,866 DEBUG base_processor.py:init_logger:386 TransformerProcessor e502f3e8-64aa-4bf2-908f-5d21355e48ca Logger initialisiert
2026-03-24 16:35:27,811 INFO transcription_utils.py:transform_by_template:1461 TransformerProcessor 6560c755-b348-4991-9d6e-8be1788f5714 Template-Transformation abgeschlossen
{"duration_ms": 4338.595151901245, "model": "google/gemini-2.5-flash"}
2026-03-24 16:35:27,810 DEBUG transcription_utils.py:create_llm_request:455 TransformerProcessor 6560c755-b348-4991-9d6e-8be1788f5714 LLM Interaktion gespeichert
{"file": "cache/transformer/temp/debug/llm/20260324_163527_template_transform.json", "tokens": 11524}
2026-03-24 16:35:27,809 DEBUG transcription_utils.py:create_llm_request:416 TransformerProcessor 6560c755-b348-4991-9d6e-8be1788f5714 LLM Request erstellt
{"purpose": "template_transform", "tokens": 11524, "duration": 4338.595151901245, "model": "google/gemini-2.5-flash", "processor": "WhisperTranscriber"}
2026-03-24 16:35:27,809 DEBUG base_processor.py:add_llm_requests:510 TransformerProcessor 6560c755-b348-4991-9d6e-8be1788f5714 1 LLM-Requests hinzugefügt
2026-03-24 16:35:23,470 INFO transcription_utils.py:transform_by_template:1187 TransformerProcessor 6560c755-b348-4991-9d6e-8be1788f5714 Sende Anfrage an LLM Provider
2026-03-24 16:35:23,470 INFO transcription_utils.py:transform_by_template:1171 TransformerProcessor 6560c755-b348-4991-9d6e-8be1788f5714 Token-Prüfung vor Template-Transformation
{"model": "google/gemini-2.5-flash", "model_limit": 8192, "text_length": 12372, "estimated_text_tokens": 9608, "estimated_system_tokens": 3080, "estimated_template_tokens": 1232, "total_estimated_tokens": 13920}
2026-03-24 16:35:23,447 INFO transcription_utils.py:_extract_system_prompt:1753 TransformerProcessor 6560c755-b348-4991-9d6e-8be1788f5714 Systemprompt aus Template extrahiert
{"prompt_length": 8462}
2026-03-24 16:35:23,447 INFO transcription_utils.py:transform_by_template:1042 TransformerProcessor 6560c755-b348-4991-9d6e-8be1788f5714 Verwende direkt übergebenes Template-Inhalt (Länge: 12015)
2026-03-24 16:35:23,447 INFO transcription_utils.py:transform_by_template:1027 TransformerProcessor 6560c755-b348-4991-9d6e-8be1788f5714 Starte Template-Transformation:
2026-03-24 16:35:23,447 DEBUG transformer_processor.py:__init__:199 TransformerProcessor 6560c755-b348-4991-9d6e-8be1788f5714 Transformer Init Timings
{"total_init_ms": 207.89, "super_init_ms": 104.89, "config_load_ms": 17.9, "client_init_ms": 0.01, "transcriber_init_ms": 84.73}
2026-03-24 16:35:23,446 DEBUG transformer_processor.py:__init__:179 TransformerProcessor 6560c755-b348-4991-9d6e-8be1788f5714 Transformer Processor initialisiert
{"model": "google/gemini-2.5-flash", "temperature": 0.1, "max_tokens": 4000, "target_format": "text", "max_concurrent_requests": 10, "timeout_seconds": 120, "templates_dir": "resources/templates"}
2026-03-24 16:35:23,239 DEBUG base_processor.py:init_logger:386 TransformerProcessor 6560c755-b348-4991-9d6e-8be1788f5714 Logger initialisiert
2026-03-24 16:35:17,985 INFO transcription_utils.py:transform_by_template:1461 TransformerProcessor ca0bfa96-5815-4f46-8e0c-09141d243619 Template-Transformation abgeschlossen
{"duration_ms": 4476.560115814209, "model": "google/gemini-2.5-flash"}
2026-03-24 16:35:17,984 DEBUG transcription_utils.py:create_llm_request:455 TransformerProcessor ca0bfa96-5815-4f46-8e0c-09141d243619 LLM Interaktion gespeichert
{"file": "cache/transformer/temp/debug/llm/20260324_163517_template_transform.json", "tokens": 24177}
2026-03-24 16:35:17,982 DEBUG transcription_utils.py:create_llm_request:416 TransformerProcessor ca0bfa96-5815-4f46-8e0c-09141d243619 LLM Request erstellt
{"purpose": "template_transform", "tokens": 24177, "duration": 4476.560115814209, "model": "google/gemini-2.5-flash", "processor": "WhisperTranscriber"}
2026-03-24 16:35:17,982 DEBUG base_processor.py:add_llm_requests:510 TransformerProcessor ca0bfa96-5815-4f46-8e0c-09141d243619 1 LLM-Requests hinzugefügt
2026-03-24 16:35:13,505 INFO transcription_utils.py:transform_by_template:1187 TransformerProcessor ca0bfa96-5815-4f46-8e0c-09141d243619 Sende Anfrage an LLM Provider
2026-03-24 16:35:13,505 INFO transcription_utils.py:transform_by_template:1171 TransformerProcessor ca0bfa96-5815-4f46-8e0c-09141d243619 Token-Prüfung vor Template-Transformation
{"model": "google/gemini-2.5-flash", "model_limit": 8192, "text_length": 35985, "estimated_text_tokens": 25856, "estimated_system_tokens": 3080, "estimated_template_tokens": 1244, "total_estimated_tokens": 30180}
2026-03-24 16:35:13,480 INFO transcription_utils.py:_extract_system_prompt:1753 TransformerProcessor ca0bfa96-5815-4f46-8e0c-09141d243619 Systemprompt aus Template extrahiert
{"prompt_length": 8462}
2026-03-24 16:35:13,480 INFO transcription_utils.py:transform_by_template:1042 TransformerProcessor ca0bfa96-5815-4f46-8e0c-09141d243619 Verwende direkt übergebenes Template-Inhalt (Länge: 12015)
2026-03-24 16:35:13,480 INFO transcription_utils.py:transform_by_template:1027 TransformerProcessor ca0bfa96-5815-4f46-8e0c-09141d243619 Starte Template-Transformation:
2026-03-24 16:35:13,480 DEBUG transformer_processor.py:__init__:199 TransformerProcessor ca0bfa96-5815-4f46-8e0c-09141d243619 Transformer Init Timings
{"total_init_ms": 201.91, "super_init_ms": 104.33, "config_load_ms": 16.68, "client_init_ms": 0.01, "transcriber_init_ms": 80.56}
2026-03-24 16:35:13,479 DEBUG transformer_processor.py:__init__:179 TransformerProcessor ca0bfa96-5815-4f46-8e0c-09141d243619 Transformer Processor initialisiert
{"model": "google/gemini-2.5-flash", "temperature": 0.1, "max_tokens": 4000, "target_format": "text", "max_concurrent_requests": 10, "timeout_seconds": 120, "templates_dir": "resources/templates"}
2026-03-24 16:35:13,278 DEBUG base_processor.py:init_logger:386 TransformerProcessor ca0bfa96-5815-4f46-8e0c-09141d243619 Logger initialisiert
2026-03-24 16:35:10,679 INFO transcription_utils.py:transform_by_template:1461 TransformerProcessor 1720c29a-3061-4e7b-8003-7ed4c391ee8c Template-Transformation abgeschlossen
{"duration_ms": 6393.95546913147, "model": "google/gemini-2.5-flash"}
2026-03-24 16:35:10,677 DEBUG transcription_utils.py:create_llm_request:455 TransformerProcessor 1720c29a-3061-4e7b-8003-7ed4c391ee8c LLM Interaktion gespeichert
{"file": "cache/transformer/temp/debug/llm/20260324_163510_template_transform.json", "tokens": 90612}
2026-03-24 16:35:10,672 DEBUG transcription_utils.py:create_llm_request:416 TransformerProcessor 1720c29a-3061-4e7b-8003-7ed4c391ee8c LLM Request erstellt
{"purpose": "template_transform", "tokens": 90612, "duration": 6393.95546913147, "model": "google/gemini-2.5-flash", "processor": "WhisperTranscriber"}
2026-03-24 16:35:10,672 DEBUG base_processor.py:add_llm_requests:510 TransformerProcessor 1720c29a-3061-4e7b-8003-7ed4c391ee8c 1 LLM-Requests hinzugefügt
2026-03-24 16:35:04,277 INFO transcription_utils.py:transform_by_template:1187 TransformerProcessor 1720c29a-3061-4e7b-8003-7ed4c391ee8c Sende Anfrage an LLM Provider
2026-03-24 16:35:04,277 INFO transcription_utils.py:transform_by_template:1171 TransformerProcessor 1720c29a-3061-4e7b-8003-7ed4c391ee8c Token-Prüfung vor Template-Transformation
{"model": "google/gemini-2.5-flash", "model_limit": 8192, "text_length": 169199, "estimated_text_tokens": 113052, "estimated_system_tokens": 3080, "estimated_template_tokens": 1208, "total_estimated_tokens": 117340}
2026-03-24 16:35:04,252 INFO transcription_utils.py:_extract_system_prompt:1753 TransformerProcessor 1720c29a-3061-4e7b-8003-7ed4c391ee8c Systemprompt aus Template extrahiert
{"prompt_length": 8462}
2026-03-24 16:35:04,252 INFO transcription_utils.py:transform_by_template:1042 TransformerProcessor 1720c29a-3061-4e7b-8003-7ed4c391ee8c Verwende direkt übergebenes Template-Inhalt (Länge: 12015)
2026-03-24 16:35:04,252 INFO transcription_utils.py:transform_by_template:1027 TransformerProcessor 1720c29a-3061-4e7b-8003-7ed4c391ee8c Starte Template-Transformation:
2026-03-24 16:35:04,252 DEBUG transformer_processor.py:__init__:199 TransformerProcessor 1720c29a-3061-4e7b-8003-7ed4c391ee8c Transformer Init Timings
{"total_init_ms": 204.29, "super_init_ms": 102.75, "config_load_ms": 18.26, "client_init_ms": 0.01, "transcriber_init_ms": 82.96}
2026-03-24 16:35:04,252 DEBUG transformer_processor.py:__init__:179 TransformerProcessor 1720c29a-3061-4e7b-8003-7ed4c391ee8c Transformer Processor initialisiert
{"model": "google/gemini-2.5-flash", "temperature": 0.1, "max_tokens": 4000, "target_format": "text", "max_concurrent_requests": 10, "timeout_seconds": 120, "templates_dir": "resources/templates"}
2026-03-24 16:35:04,048 DEBUG base_processor.py:init_logger:386 TransformerProcessor 1720c29a-3061-4e7b-8003-7ed4c391ee8c Logger initialisiert
2026-03-24 16:35:00,168 INFO transcription_utils.py:transform_by_template:1461 TransformerProcessor 73e57cef-0e45-4187-b3c5-a9e3c2ac99b3 Template-Transformation abgeschlossen
{"duration_ms": 3057.093620300293, "model": "google/gemini-2.5-flash"}
2026-03-24 16:35:00,166 DEBUG transcription_utils.py:create_llm_request:455 TransformerProcessor 73e57cef-0e45-4187-b3c5-a9e3c2ac99b3 LLM Interaktion gespeichert
{"file": "cache/transformer/temp/debug/llm/20260324_163500_template_transform.json", "tokens": 17802}
2026-03-24 16:35:00,165 DEBUG transcription_utils.py:create_llm_request:416 TransformerProcessor 73e57cef-0e45-4187-b3c5-a9e3c2ac99b3 LLM Request erstellt
{"purpose": "template_transform", "tokens": 17802, "duration": 3057.093620300293, "model": "google/gemini-2.5-flash", "processor": "WhisperTranscriber"}
2026-03-24 16:35:00,164 DEBUG base_processor.py:add_llm_requests:510 TransformerProcessor 73e57cef-0e45-4187-b3c5-a9e3c2ac99b3 1 LLM-Requests hinzugefügt
2026-03-24 16:34:57,107 INFO transcription_utils.py:transform_by_template:1187 TransformerProcessor 73e57cef-0e45-4187-b3c5-a9e3c2ac99b3 Sende Anfrage an LLM Provider
2026-03-24 16:34:57,106 INFO transcription_utils.py:transform_by_template:1171 TransformerProcessor 73e57cef-0e45-4187-b3c5-a9e3c2ac99b3 Token-Prüfung vor Template-Transformation
{"model": "google/gemini-2.5-flash", "model_limit": 8192, "text_length": 24089, "estimated_text_tokens": 18858, "estimated_system_tokens": 3080, "estimated_template_tokens": 1232, "total_estimated_tokens": 23170}
2026-03-24 16:34:57,083 INFO transcription_utils.py:_extract_system_prompt:1753 TransformerProcessor 73e57cef-0e45-4187-b3c5-a9e3c2ac99b3 Systemprompt aus Template extrahiert
{"prompt_length": 8462}
2026-03-24 16:34:57,083 INFO transcription_utils.py:transform_by_template:1042 TransformerProcessor 73e57cef-0e45-4187-b3c5-a9e3c2ac99b3 Verwende direkt übergebenes Template-Inhalt (Länge: 12015)
2026-03-24 16:34:57,083 INFO transcription_utils.py:transform_by_template:1027 TransformerProcessor 73e57cef-0e45-4187-b3c5-a9e3c2ac99b3 Starte Template-Transformation:
2026-03-24 16:34:57,082 DEBUG transformer_processor.py:__init__:199 TransformerProcessor 73e57cef-0e45-4187-b3c5-a9e3c2ac99b3 Transformer Init Timings
{"total_init_ms": 214.37, "super_init_ms": 108.71, "config_load_ms": 18.06, "client_init_ms": 0.01, "transcriber_init_ms": 87.25}
2026-03-24 16:34:57,082 DEBUG transformer_processor.py:__init__:179 TransformerProcessor 73e57cef-0e45-4187-b3c5-a9e3c2ac99b3 Transformer Processor initialisiert
{"model": "google/gemini-2.5-flash", "temperature": 0.1, "max_tokens": 4000, "target_format": "text", "max_concurrent_requests": 10, "timeout_seconds": 120, "templates_dir": "resources/templates"}
2026-03-24 16:34:56,869 DEBUG base_processor.py:init_logger:386 TransformerProcessor 73e57cef-0e45-4187-b3c5-a9e3c2ac99b3 Logger initialisiert
2026-03-24 16:34:50,538 INFO transcription_utils.py:transform_by_template:1461 TransformerProcessor e27e9b0a-18be-4883-90d4-c5775bb90a90 Template-Transformation abgeschlossen
{"duration_ms": 7002.835273742676, "model": "google/gemini-2.5-flash"}
2026-03-24 16:34:50,534 DEBUG transcription_utils.py:create_llm_request:455 TransformerProcessor e27e9b0a-18be-4883-90d4-c5775bb90a90 LLM Interaktion gespeichert
{"file": "cache/transformer/temp/debug/llm/20260324_163450_template_transform.json", "tokens": 8227}
2026-03-24 16:34:50,533 DEBUG transcription_utils.py:create_llm_request:416 TransformerProcessor e27e9b0a-18be-4883-90d4-c5775bb90a90 LLM Request erstellt
{"purpose": "template_transform", "tokens": 8227, "duration": 7002.835273742676, "model": "google/gemini-2.5-flash", "processor": "WhisperTranscriber"}
2026-03-24 16:34:50,533 DEBUG base_processor.py:add_llm_requests:510 TransformerProcessor e27e9b0a-18be-4883-90d4-c5775bb90a90 1 LLM-Requests hinzugefügt
2026-03-24 16:34:45,901 INFO transcription_utils.py:transform_by_template:1187 TransformerProcessor 29d00438-6502-47c1-a125-29ff0acda34b Sende Anfrage an LLM Provider
2026-03-24 16:34:45,900 INFO transcription_utils.py:transform_by_template:1171 TransformerProcessor 29d00438-6502-47c1-a125-29ff0acda34b Token-Prüfung vor Template-Transformation
{"model": "google/gemini-2.5-flash", "model_limit": 8192, "text_length": 672382, "estimated_text_tokens": 398030, "estimated_system_tokens": 3080, "estimated_template_tokens": 1208, "total_estimated_tokens": 402318}
2026-03-24 16:34:45,871 INFO transcription_utils.py:_extract_system_prompt:1753 TransformerProcessor 29d00438-6502-47c1-a125-29ff0acda34b Systemprompt aus Template extrahiert
{"prompt_length": 8462}
2026-03-24 16:34:45,870 INFO transcription_utils.py:transform_by_template:1042 TransformerProcessor 29d00438-6502-47c1-a125-29ff0acda34b Verwende direkt übergebenes Template-Inhalt (Länge: 12015)
2026-03-24 16:34:45,870 INFO transcription_utils.py:transform_by_template:1027 TransformerProcessor 29d00438-6502-47c1-a125-29ff0acda34b Starte Template-Transformation:
2026-03-24 16:34:45,870 DEBUG transformer_processor.py:__init__:199 TransformerProcessor 29d00438-6502-47c1-a125-29ff0acda34b Transformer Init Timings
{"total_init_ms": 289.37, "super_init_ms": 167.98, "config_load_ms": 16.91, "client_init_ms": 27.88, "transcriber_init_ms": 76.32}
2026-03-24 16:34:45,870 DEBUG transformer_processor.py:__init__:179 TransformerProcessor 29d00438-6502-47c1-a125-29ff0acda34b Transformer Processor initialisiert
{"model": "google/gemini-2.5-flash", "temperature": 0.1, "max_tokens": 4000, "target_format": "text", "max_concurrent_requests": 10, "timeout_seconds": 120, "templates_dir": "resources/templates"}
2026-03-24 16:34:45,821 ERROR transformer_processor.py:transformByTemplate:2166 TransformerProcessor 063332b1-3376-49a2-9713-e070c656ef61 Fehler bei der Template-Transformation: LLM-Anfrage fehlgeschlagen. Provider: konfigurierter LLM-Provider, Modell: gpt-4o-transcribe. Fehler: Chat-Completion Provider nicht verfügbar
2026-03-24 16:34:45,821 ERROR transcription_utils.py:transform_by_template:1479 TransformerProcessor 063332b1-3376-49a2-9713-e070c656ef61 Fehler bei der Template-Transformation
2026-03-24 16:34:45,821 ERROR transcription_utils.py:transform_by_template:1244 TransformerProcessor 063332b1-3376-49a2-9713-e070c656ef61 LLM-Provider-Anfrage fehlgeschlagen
2026-03-24 16:34:45,821 INFO transcription_utils.py:transform_by_template:1187 TransformerProcessor 063332b1-3376-49a2-9713-e070c656ef61 Sende Anfrage an LLM Provider
2026-03-24 16:34:45,820 INFO transcription_utils.py:transform_by_template:1171 TransformerProcessor 063332b1-3376-49a2-9713-e070c656ef61 Token-Prüfung vor Template-Transformation
{"model": "gpt-4o-transcribe", "model_limit": 128000, "text_length": 533123, "estimated_text_tokens": 327792, "estimated_system_tokens": 3080, "estimated_template_tokens": 1214, "total_estimated_tokens": 332086}
2026-03-24 16:34:45,771 INFO transcription_utils.py:_extract_system_prompt:1753 TransformerProcessor 063332b1-3376-49a2-9713-e070c656ef61 Systemprompt aus Template extrahiert
{"prompt_length": 8462}
2026-03-24 16:34:45,771 INFO transcription_utils.py:transform_by_template:1042 TransformerProcessor 063332b1-3376-49a2-9713-e070c656ef61 Verwende direkt übergebenes Template-Inhalt (Länge: 12015)
2026-03-24 16:34:45,771 INFO transcription_utils.py:transform_by_template:1027 TransformerProcessor 063332b1-3376-49a2-9713-e070c656ef61 Starte Template-Transformation:
2026-03-24 16:34:45,770 DEBUG transformer_processor.py:__init__:199 TransformerProcessor 063332b1-3376-49a2-9713-e070c656ef61 Transformer Init Timings
{"total_init_ms": 287.77, "super_init_ms": 118.25, "config_load_ms": 35.02, "client_init_ms": 0.02, "transcriber_init_ms": 134.1}
2026-03-24 16:34:45,770 DEBUG transformer_processor.py:__init__:179 TransformerProcessor 063332b1-3376-49a2-9713-e070c656ef61 Transformer Processor initialisiert
{"model": "google/gemini-2.5-flash", "temperature": 0.1, "max_tokens": 4000, "target_format": "text", "max_concurrent_requests": 10, "timeout_seconds": 120, "templates_dir": "resources/templates"}
2026-03-24 16:34:45,581 DEBUG base_processor.py:init_logger:386 TransformerProcessor 29d00438-6502-47c1-a125-29ff0acda34b Logger initialisiert
2026-03-24 16:34:45,483 DEBUG base_processor.py:init_logger:386 TransformerProcessor 063332b1-3376-49a2-9713-e070c656ef61 Logger initialisiert
2026-03-24 16:34:43,530 INFO transcription_utils.py:transform_by_template:1187 TransformerProcessor e27e9b0a-18be-4883-90d4-c5775bb90a90 Sende Anfrage an LLM Provider
2026-03-24 16:34:43,530 INFO transcription_utils.py:transform_by_template:1171 TransformerProcessor e27e9b0a-18be-4883-90d4-c5775bb90a90 Token-Prüfung vor Template-Transformation
{"model": "google/gemini-2.5-flash", "model_limit": 8192, "text_length": 6914, "estimated_text_tokens": 5256, "estimated_system_tokens": 3080, "estimated_template_tokens": 1232, "total_estimated_tokens": 9568}
2026-03-24 16:34:43,506 INFO transcription_utils.py:_extract_system_prompt:1753 TransformerProcessor e27e9b0a-18be-4883-90d4-c5775bb90a90 Systemprompt aus Template extrahiert
{"prompt_length": 8462}
2026-03-24 16:34:43,506 INFO transcription_utils.py:transform_by_template:1042 TransformerProcessor e27e9b0a-18be-4883-90d4-c5775bb90a90 Verwende direkt übergebenes Template-Inhalt (Länge: 12015)
2026-03-24 16:34:43,505 INFO transcription_utils.py:transform_by_template:1027 TransformerProcessor e27e9b0a-18be-4883-90d4-c5775bb90a90 Starte Template-Transformation:
2026-03-24 16:34:43,505 DEBUG transformer_processor.py:__init__:199 TransformerProcessor e27e9b0a-18be-4883-90d4-c5775bb90a90 Transformer Init Timings
{"total_init_ms": 194.41, "super_init_ms": 92.52, "config_load_ms": 16.96, "client_init_ms": 0.01, "transcriber_init_ms": 84.57}
2026-03-24 16:34:43,505 DEBUG transformer_processor.py:__init__:179 TransformerProcessor e27e9b0a-18be-4883-90d4-c5775bb90a90 Transformer Processor initialisiert
{"model": "google/gemini-2.5-flash", "temperature": 0.1, "max_tokens": 4000, "target_format": "text", "max_concurrent_requests": 10, "timeout_seconds": 120, "templates_dir": "resources/templates"}
2026-03-24 16:34:43,311 DEBUG base_processor.py:init_logger:386 TransformerProcessor e27e9b0a-18be-4883-90d4-c5775bb90a90 Logger initialisiert
2026-03-24 13:05:50,090 DEBUG transformer_processor.py:__init__:199 TransformerProcessor 8e12992e-50b2-45f9-8aea-174d594adc72 Transformer Init Timings
{"total_init_ms": 233.73, "super_init_ms": 123.16, "config_load_ms": 16.66, "client_init_ms": 0.01, "transcriber_init_ms": 93.59}
2026-03-24 13:05:50,090 DEBUG transformer_processor.py:__init__:179 TransformerProcessor 8e12992e-50b2-45f9-8aea-174d594adc72 Transformer Processor initialisiert
{"model": "google/gemini-2.5-flash", "temperature": 0.1, "max_tokens": 4000, "target_format": "text", "max_concurrent_requests": 10, "timeout_seconds": 120, "templates_dir": "resources/templates"}
2026-03-24 13:05:49,857 DEBUG base_processor.py:init_logger:386 TransformerProcessor 8e12992e-50b2-45f9-8aea-174d594adc72 Logger initialisiert
2026-03-24 13:05:30,245 DEBUG transformer_processor.py:__init__:199 TransformerProcessor 00884bbf-90c0-4c53-a5ad-2402cf1e9a01 Transformer Init Timings
{"total_init_ms": 223.69, "super_init_ms": 116.55, "config_load_ms": 17.97, "client_init_ms": 0.01, "transcriber_init_ms": 88.74}
2026-03-24 13:05:30,244 DEBUG transformer_processor.py:__init__:179 TransformerProcessor 00884bbf-90c0-4c53-a5ad-2402cf1e9a01 Transformer Processor initialisiert
{"model": "google/gemini-2.5-flash", "temperature": 0.1, "max_tokens": 4000, "target_format": "text", "max_concurrent_requests": 10, "timeout_seconds": 120, "templates_dir": "resources/templates"}
2026-03-24 13:05:30,022 DEBUG base_processor.py:init_logger:386 TransformerProcessor 00884bbf-90c0-4c53-a5ad-2402cf1e9a01 Logger initialisiert
2026-03-24 13:05:29,355 INFO rag_processor.py:embed_document_for_client:571 RAGProcessor 23329652-7169-4cf2-8eb3-0690af14d3a7 Embedding (client) abgeschlossen
{"extra": {"document_id": "02d98987-1b0b-42c1-9175-83abc3cfb6b9", "chunks": 1, "duration_seconds": 0.26215219497680664}}
2026-03-24 13:05:29,093 INFO rag_processor.py:embed_document_for_client:516 RAGProcessor 23329652-7169-4cf2-8eb3-0690af14d3a7 Starte Embedding-Generierung (client)
{"extra": {"chunk_count": 1, "embedding_model": "voyage-3-large", "embedding_dimensions": 2048}}
2026-03-24 13:05:29,093 INFO rag_processor.py:embed_document_for_client:510 RAGProcessor 23329652-7169-4cf2-8eb3-0690af14d3a7 Chunking abgeschlossen (client): 1 Chunks erstellt
2026-03-24 13:05:29,092 INFO rag_processor.py:embed_document_for_client:508 RAGProcessor 23329652-7169-4cf2-8eb3-0690af14d3a7 Starte Chunking (client)
{"extra": {"document_id": "02d98987-1b0b-42c1-9175-83abc3cfb6b9"}}
2026-03-24 13:05:29,092 INFO rag_processor.py:__init__:124 RAGProcessor 23329652-7169-4cf2-8eb3-0690af14d3a7 RAGProcessor initialisiert
{"extra": {"embedding_model": "voyage-3-large", "embedding_dimensions": 2048, "chunk_size": 1000, "chunk_overlap": 200}}
2026-03-24 13:05:28,931 DEBUG base_processor.py:init_logger:386 RAGProcessor 23329652-7169-4cf2-8eb3-0690af14d3a7 Logger initialisiert
2026-03-24 13:04:37,050 DEBUG transformer_processor.py:__init__:199 TransformerProcessor 07a67cda-96bf-40ed-a2d2-a85a4d96effe Transformer Init Timings
{"total_init_ms": 224.42, "super_init_ms": 125.43, "config_load_ms": 16.96, "client_init_ms": 0.01, "transcriber_init_ms": 81.7}
2026-03-24 13:04:37,050 DEBUG transformer_processor.py:__init__:179 TransformerProcessor 07a67cda-96bf-40ed-a2d2-a85a4d96effe Transformer Processor initialisiert
{"model": "google/gemini-2.5-flash", "temperature": 0.1, "max_tokens": 4000, "target_format": "text", "max_concurrent_requests": 10, "timeout_seconds": 120, "templates_dir": "resources/templates"}
2026-03-24 13:04:36,826 DEBUG base_processor.py:init_logger:386 TransformerProcessor 07a67cda-96bf-40ed-a2d2-a85a4d96effe Logger initialisiert
2026-03-24 13:04:36,086 INFO rag_processor.py:embed_document_for_client:571 RAGProcessor 8a629e80-2d4b-4e54-893c-6bd3cd82c05d Embedding (client) abgeschlossen
{"extra": {"document_id": "fb72c59d-92a4-4165-b3fa-37f1ec7de070", "chunks": 1, "duration_seconds": 0.28343915939331055}}
2026-03-24 13:04:35,803 INFO rag_processor.py:embed_document_for_client:516 RAGProcessor 8a629e80-2d4b-4e54-893c-6bd3cd82c05d Starte Embedding-Generierung (client)
{"extra": {"chunk_count": 1, "embedding_model": "voyage-3-large", "embedding_dimensions": 2048}}
2026-03-24 13:04:35,803 INFO rag_processor.py:embed_document_for_client:510 RAGProcessor 8a629e80-2d4b-4e54-893c-6bd3cd82c05d Chunking abgeschlossen (client): 1 Chunks erstellt
2026-03-24 13:04:35,803 INFO rag_processor.py:embed_document_for_client:508 RAGProcessor 8a629e80-2d4b-4e54-893c-6bd3cd82c05d Starte Chunking (client)
{"extra": {"document_id": "fb72c59d-92a4-4165-b3fa-37f1ec7de070"}}
2026-03-24 13:04:35,802 INFO rag_processor.py:__init__:124 RAGProcessor 8a629e80-2d4b-4e54-893c-6bd3cd82c05d RAGProcessor initialisiert
{"extra": {"embedding_model": "voyage-3-large", "embedding_dimensions": 2048, "chunk_size": 1000, "chunk_overlap": 200}}
2026-03-24 13:04:35,704 DEBUG base_processor.py:init_logger:386 RAGProcessor 8a629e80-2d4b-4e54-893c-6bd3cd82c05d Logger initialisiert
2026-03-24 13:03:54,397 DEBUG transformer_processor.py:__init__:199 TransformerProcessor 6167a7c1-7de8-4feb-9627-741f385ba5cc Transformer Init Timings
{"total_init_ms": 206.17, "super_init_ms": 97.45, "config_load_ms": 17.32, "client_init_ms": 0.01, "transcriber_init_ms": 91.02}
2026-03-24 13:03:54,397 DEBUG transformer_processor.py:__init__:179 TransformerProcessor 6167a7c1-7de8-4feb-9627-741f385ba5cc Transformer Processor initialisiert
{"model": "google/gemini-2.5-flash", "temperature": 0.1, "max_tokens": 4000, "target_format": "text", "max_concurrent_requests": 10, "timeout_seconds": 120, "templates_dir": "resources/templates"}
2026-03-24 13:03:54,191 DEBUG base_processor.py:init_logger:386 TransformerProcessor 6167a7c1-7de8-4feb-9627-741f385ba5cc Logger initialisiert
2026-03-24 11:40:34,969 DEBUG transformer_processor.py:__init__:199 TransformerProcessor 0230fe93-1a51-4853-9f10-9d6742812054 Transformer Init Timings
{"total_init_ms": 233.18, "super_init_ms": 133.51, "config_load_ms": 17.2, "client_init_ms": 0.01, "transcriber_init_ms": 82.07}
2026-03-24 11:40:34,969 DEBUG transformer_processor.py:__init__:179 TransformerProcessor 0230fe93-1a51-4853-9f10-9d6742812054 Transformer Processor initialisiert
{"model": "google/gemini-2.5-flash", "temperature": 0.1, "max_tokens": 4000, "target_format": "text", "max_concurrent_requests": 10, "timeout_seconds": 120, "templates_dir": "resources/templates"}
2026-03-24 11:40:34,737 DEBUG base_processor.py:init_logger:386 TransformerProcessor 0230fe93-1a51-4853-9f10-9d6742812054 Logger initialisiert
2026-03-24 11:22:02,165 DEBUG transformer_processor.py:__init__:199 TransformerProcessor 454dd0a0-88ff-435d-ac61-352d5d653b16 Transformer Init Timings
{"total_init_ms": 286.63, "super_init_ms": 132.5, "config_load_ms": 28.67, "client_init_ms": 0.02, "transcriber_init_ms": 124.98}
2026-03-24 11:22:02,164 DEBUG transformer_processor.py:__init__:179 TransformerProcessor 454dd0a0-88ff-435d-ac61-352d5d653b16 Transformer Processor initialisiert
{"model": "google/gemini-2.5-flash", "temperature": 0.1, "max_tokens": 4000, "target_format": "text", "max_concurrent_requests": 10, "timeout_seconds": 120, "templates_dir": "resources/templates"}
2026-03-24 11:22:01,878 DEBUG base_processor.py:init_logger:386 TransformerProcessor 454dd0a0-88ff-435d-ac61-352d5d653b16 Logger initialisiert
2026-03-24 11:21:24,596 INFO rag_processor.py:embed_document_for_client:571 RAGProcessor 60cc31e4-3fc4-4252-a016-9d0899866559 Embedding (client) abgeschlossen
{"extra": {"document_id": "2b9fffb7-d317-473c-bdc4-f6263c3537a4", "chunks": 1, "duration_seconds": 0.27420783042907715}}
2026-03-24 11:21:24,322 INFO rag_processor.py:embed_document_for_client:516 RAGProcessor 60cc31e4-3fc4-4252-a016-9d0899866559 Starte Embedding-Generierung (client)
{"extra": {"chunk_count": 1, "embedding_model": "voyage-3-large", "embedding_dimensions": 2048}}
2026-03-24 11:21:24,322 INFO rag_processor.py:embed_document_for_client:510 RAGProcessor 60cc31e4-3fc4-4252-a016-9d0899866559 Chunking abgeschlossen (client): 1 Chunks erstellt
2026-03-24 11:21:24,322 INFO rag_processor.py:embed_document_for_client:508 RAGProcessor 60cc31e4-3fc4-4252-a016-9d0899866559 Starte Chunking (client)
{"extra": {"document_id": "2b9fffb7-d317-473c-bdc4-f6263c3537a4"}}
2026-03-24 11:21:24,321 INFO rag_processor.py:__init__:124 RAGProcessor 60cc31e4-3fc4-4252-a016-9d0899866559 RAGProcessor initialisiert
{"extra": {"embedding_model": "voyage-3-large", "embedding_dimensions": 2048, "chunk_size": 1000, "chunk_overlap": 200}}
2026-03-24 11:21:24,219 DEBUG base_processor.py:init_logger:386 RAGProcessor 60cc31e4-3fc4-4252-a016-9d0899866559 Logger initialisiert
2026-03-24 11:21:24,103 INFO rag_processor.py:embed_document_for_client:571 RAGProcessor 1dc76d83-0efb-496f-bd65-e47c9b460b33 Embedding (client) abgeschlossen
{"extra": {"document_id": "V2Vic2VpdGUvMjAyNi8yMDI2LTAxIFdvcmtzaG9wIE9sZGllcy8yMDI2LTAyIFdvcmtzaG9wIE9sZGllcy5tZA==", "chunks": 5, "duration_seconds": 0.3445618152618408}}
2026-03-24 11:21:23,758 INFO rag_processor.py:embed_document_for_client:516 RAGProcessor 1dc76d83-0efb-496f-bd65-e47c9b460b33 Starte Embedding-Generierung (client)
{"extra": {"chunk_count": 5, "embedding_model": "voyage-3-large", "embedding_dimensions": 2048}}
2026-03-24 11:21:23,758 INFO rag_processor.py:embed_document_for_client:510 RAGProcessor 1dc76d83-0efb-496f-bd65-e47c9b460b33 Chunking abgeschlossen (client): 5 Chunks erstellt
2026-03-24 11:21:23,758 INFO rag_processor.py:embed_document_for_client:508 RAGProcessor 1dc76d83-0efb-496f-bd65-e47c9b460b33 Starte Chunking (client)
{"extra": {"document_id": "V2Vic2VpdGUvMjAyNi8yMDI2LTAxIFdvcmtzaG9wIE9sZGllcy8yMDI2LTAyIFdvcmtzaG9wIE9sZGllcy5tZA=="}}
2026-03-24 11:21:23,757 INFO rag_processor.py:__init__:124 RAGProcessor 1dc76d83-0efb-496f-bd65-e47c9b460b33 RAGProcessor initialisiert
{"extra": {"embedding_model": "voyage-3-large", "embedding_dimensions": 2048, "chunk_size": 1000, "chunk_overlap": 200}}
2026-03-24 11:21:23,658 DEBUG base_processor.py:init_logger:386 RAGProcessor 1dc76d83-0efb-496f-bd65-e47c9b460b33 Logger initialisiert
2026-03-24 11:20:10,429 INFO transcription_utils.py:transform_by_template:1461 TransformerProcessor 7433bf47-e1a2-42b1-a688-b305ad0d981e Template-Transformation abgeschlossen
{"duration_ms": 33357.58852958679, "model": "google/gemini-3-pro-preview"}
2026-03-24 11:20:10,427 DEBUG transcription_utils.py:create_llm_request:455 TransformerProcessor 7433bf47-e1a2-42b1-a688-b305ad0d981e LLM Interaktion gespeichert
{"file": "cache/transformer/temp/debug/llm/20260324_112010_template_transform.json", "tokens": 7780}
2026-03-24 11:20:10,426 DEBUG transcription_utils.py:create_llm_request:416 TransformerProcessor 7433bf47-e1a2-42b1-a688-b305ad0d981e LLM Request erstellt
{"purpose": "template_transform", "tokens": 7780, "duration": 33357.58852958679, "model": "google/gemini-3-pro-preview", "processor": "WhisperTranscriber"}
2026-03-24 11:20:10,426 DEBUG base_processor.py:add_llm_requests:510 TransformerProcessor 7433bf47-e1a2-42b1-a688-b305ad0d981e 1 LLM-Requests hinzugefügt
2026-03-24 11:19:37,067 INFO transcription_utils.py:transform_by_template:1187 TransformerProcessor 7433bf47-e1a2-42b1-a688-b305ad0d981e Sende Anfrage an LLM Provider
2026-03-24 11:19:37,067 INFO transcription_utils.py:transform_by_template:1171 TransformerProcessor 7433bf47-e1a2-42b1-a688-b305ad0d981e Token-Prüfung vor Template-Transformation
{"model": "google/gemini-2.5-flash", "model_limit": 8192, "text_length": 10367, "estimated_text_tokens": 3576, "estimated_system_tokens": 966, "estimated_template_tokens": 1254, "total_estimated_tokens": 5796}
2026-03-24 11:19:37,043 INFO transcription_utils.py:_extract_system_prompt:1753 TransformerProcessor 7433bf47-e1a2-42b1-a688-b305ad0d981e Systemprompt aus Template extrahiert
{"prompt_length": 2504}
2026-03-24 11:19:37,043 INFO transcription_utils.py:transform_by_template:1042 TransformerProcessor 7433bf47-e1a2-42b1-a688-b305ad0d981e Verwende direkt übergebenes Template-Inhalt (Länge: 5567)
2026-03-24 11:19:37,043 INFO transcription_utils.py:transform_by_template:1027 TransformerProcessor 7433bf47-e1a2-42b1-a688-b305ad0d981e Starte Template-Transformation:
2026-03-24 11:19:37,043 DEBUG transformer_processor.py:__init__:199 TransformerProcessor 7433bf47-e1a2-42b1-a688-b305ad0d981e Transformer Init Timings
{"total_init_ms": 212.1, "super_init_ms": 106.5, "config_load_ms": 18.08, "client_init_ms": 0.01, "transcriber_init_ms": 87.17}
2026-03-24 11:19:37,042 DEBUG transformer_processor.py:__init__:179 TransformerProcessor 7433bf47-e1a2-42b1-a688-b305ad0d981e Transformer Processor initialisiert
{"model": "google/gemini-2.5-flash", "temperature": 0.1, "max_tokens": 4000, "target_format": "text", "max_concurrent_requests": 10, "timeout_seconds": 120, "templates_dir": "resources/templates"}
2026-03-24 11:19:36,831 DEBUG base_processor.py:init_logger:386 TransformerProcessor 7433bf47-e1a2-42b1-a688-b305ad0d981e Logger initialisiert
2026-03-24 11:19:16,222 INFO transcription_utils.py:transform_by_template:1461 TransformerProcessor 99c7a00d-8c90-4da6-8e3c-7369e3499af2 Template-Transformation abgeschlossen
{"duration_ms": 34485.223054885864, "model": "google/gemini-3-pro-preview"}
2026-03-24 11:19:16,221 DEBUG transcription_utils.py:create_llm_request:455 TransformerProcessor 99c7a00d-8c90-4da6-8e3c-7369e3499af2 LLM Interaktion gespeichert
{"file": "cache/transformer/temp/debug/llm/20260324_111916_template_transform.json", "tokens": 7873}
2026-03-24 11:19:16,220 DEBUG transcription_utils.py:create_llm_request:416 TransformerProcessor 99c7a00d-8c90-4da6-8e3c-7369e3499af2 LLM Request erstellt
{"purpose": "template_transform", "tokens": 7873, "duration": 34485.223054885864, "model": "google/gemini-3-pro-preview", "processor": "WhisperTranscriber"}
2026-03-24 11:19:16,220 DEBUG base_processor.py:add_llm_requests:510 TransformerProcessor 99c7a00d-8c90-4da6-8e3c-7369e3499af2 1 LLM-Requests hinzugefügt
2026-03-24 11:18:41,734 INFO transcription_utils.py:transform_by_template:1187 TransformerProcessor 99c7a00d-8c90-4da6-8e3c-7369e3499af2 Sende Anfrage an LLM Provider
2026-03-24 11:18:41,734 INFO transcription_utils.py:transform_by_template:1171 TransformerProcessor 99c7a00d-8c90-4da6-8e3c-7369e3499af2 Token-Prüfung vor Template-Transformation
{"model": "google/gemini-2.5-flash", "model_limit": 8192, "text_length": 10367, "estimated_text_tokens": 3576, "estimated_system_tokens": 966, "estimated_template_tokens": 1254, "total_estimated_tokens": 5796}
2026-03-24 11:18:41,701 INFO transcription_utils.py:_extract_system_prompt:1753 TransformerProcessor 99c7a00d-8c90-4da6-8e3c-7369e3499af2 Systemprompt aus Template extrahiert
{"prompt_length": 2504}
2026-03-24 11:18:41,701 INFO transcription_utils.py:transform_by_template:1042 TransformerProcessor 99c7a00d-8c90-4da6-8e3c-7369e3499af2 Verwende direkt übergebenes Template-Inhalt (Länge: 5567)
2026-03-24 11:18:41,701 INFO transcription_utils.py:transform_by_template:1027 TransformerProcessor 99c7a00d-8c90-4da6-8e3c-7369e3499af2 Starte Template-Transformation:
2026-03-24 11:18:41,700 DEBUG transformer_processor.py:__init__:199 TransformerProcessor 99c7a00d-8c90-4da6-8e3c-7369e3499af2 Transformer Init Timings
{"total_init_ms": 262.33, "super_init_ms": 128.59, "config_load_ms": 17.71, "client_init_ms": 0.01, "transcriber_init_ms": 115.59}
2026-03-24 11:18:41,700 DEBUG transformer_processor.py:__init__:179 TransformerProcessor 99c7a00d-8c90-4da6-8e3c-7369e3499af2 Transformer Processor initialisiert
{"model": "google/gemini-2.5-flash", "temperature": 0.1, "max_tokens": 4000, "target_format": "text", "max_concurrent_requests": 10, "timeout_seconds": 120, "templates_dir": "resources/templates"}
2026-03-24 11:18:41,439 DEBUG base_processor.py:init_logger:386 TransformerProcessor 99c7a00d-8c90-4da6-8e3c-7369e3499af2 Logger initialisiert
2026-03-24 11:15:32,429 INFO transcription_utils.py:transform_by_template:1461 TransformerProcessor 7df5cb7a-4089-4932-baef-5c8414697774 Template-Transformation abgeschlossen
{"duration_ms": 32935.17827987671, "model": "google/gemini-3-pro-preview"}
2026-03-24 11:15:32,427 DEBUG transcription_utils.py:create_llm_request:455 TransformerProcessor 7df5cb7a-4089-4932-baef-5c8414697774 LLM Interaktion gespeichert
{"file": "cache/transformer/temp/debug/llm/20260324_111532_template_transform.json", "tokens": 8024}
2026-03-24 11:15:32,426 DEBUG transcription_utils.py:create_llm_request:416 TransformerProcessor 7df5cb7a-4089-4932-baef-5c8414697774 LLM Request erstellt
{"purpose": "template_transform", "tokens": 8024, "duration": 32935.17827987671, "model": "google/gemini-3-pro-preview", "processor": "WhisperTranscriber"}
2026-03-24 11:15:32,426 DEBUG base_processor.py:add_llm_requests:510 TransformerProcessor 7df5cb7a-4089-4932-baef-5c8414697774 1 LLM-Requests hinzugefügt
2026-03-24 11:15:13,750 INFO rag_processor.py:embed_document_for_client:571 RAGProcessor a0c2e6f6-878c-4563-956a-e6fe13971067 Embedding (client) abgeschlossen
{"extra": {"document_id": "a90e0b2f-75b8-40ea-b86b-dd0896c76657", "chunks": 1, "duration_seconds": 0.287656307220459}}
2026-03-24 11:15:13,463 INFO rag_processor.py:embed_document_for_client:516 RAGProcessor a0c2e6f6-878c-4563-956a-e6fe13971067 Starte Embedding-Generierung (client)
{"extra": {"chunk_count": 1, "embedding_model": "voyage-3-large", "embedding_dimensions": 2048}}
2026-03-24 11:15:13,463 INFO rag_processor.py:embed_document_for_client:510 RAGProcessor a0c2e6f6-878c-4563-956a-e6fe13971067 Chunking abgeschlossen (client): 1 Chunks erstellt
2026-03-24 11:15:13,462 INFO rag_processor.py:embed_document_for_client:508 RAGProcessor a0c2e6f6-878c-4563-956a-e6fe13971067 Starte Chunking (client)
{"extra": {"document_id": "a90e0b2f-75b8-40ea-b86b-dd0896c76657"}}
2026-03-24 11:15:13,462 INFO rag_processor.py:__init__:124 RAGProcessor a0c2e6f6-878c-4563-956a-e6fe13971067 RAGProcessor initialisiert
{"extra": {"embedding_model": "voyage-3-large", "embedding_dimensions": 2048, "chunk_size": 1000, "chunk_overlap": 200}}
2026-03-24 11:15:13,324 DEBUG base_processor.py:init_logger:386 RAGProcessor a0c2e6f6-878c-4563-956a-e6fe13971067 Logger initialisiert
2026-03-24 11:15:13,215 INFO rag_processor.py:embed_document_for_client:571 RAGProcessor aedeaf1f-732b-4fbd-858f-055b2c86f3d0 Embedding (client) abgeschlossen
{"extra": {"document_id": "01XERETULG6IMB76VEZVD3W2BD327ETXHT", "chunks": 6, "duration_seconds": 0.6703486442565918}}
2026-03-24 11:15:12,545 INFO rag_processor.py:embed_document_for_client:516 RAGProcessor aedeaf1f-732b-4fbd-858f-055b2c86f3d0 Starte Embedding-Generierung (client)
{"extra": {"chunk_count": 6, "embedding_model": "voyage-3-large", "embedding_dimensions": 2048}}
2026-03-24 11:15:12,545 INFO rag_processor.py:embed_document_for_client:510 RAGProcessor aedeaf1f-732b-4fbd-858f-055b2c86f3d0 Chunking abgeschlossen (client): 6 Chunks erstellt
2026-03-24 11:15:12,545 INFO rag_processor.py:embed_document_for_client:508 RAGProcessor aedeaf1f-732b-4fbd-858f-055b2c86f3d0 Starte Chunking (client)
{"extra": {"document_id": "01XERETULG6IMB76VEZVD3W2BD327ETXHT"}}
2026-03-24 11:15:12,544 INFO rag_processor.py:__init__:124 RAGProcessor aedeaf1f-732b-4fbd-858f-055b2c86f3d0 RAGProcessor initialisiert
{"extra": {"embedding_model": "voyage-3-large", "embedding_dimensions": 2048, "chunk_size": 1000, "chunk_overlap": 200}}
2026-03-24 11:15:12,447 DEBUG base_processor.py:init_logger:386 RAGProcessor aedeaf1f-732b-4fbd-858f-055b2c86f3d0 Logger initialisiert
2026-03-24 11:14:59,490 INFO transcription_utils.py:transform_by_template:1187 TransformerProcessor 7df5cb7a-4089-4932-baef-5c8414697774 Sende Anfrage an LLM Provider
2026-03-24 11:14:59,490 INFO transcription_utils.py:transform_by_template:1171 TransformerProcessor 7df5cb7a-4089-4932-baef-5c8414697774 Token-Prüfung vor Template-Transformation
{"model": "google/gemini-2.5-flash", "model_limit": 8192, "text_length": 10367, "estimated_text_tokens": 3576, "estimated_system_tokens": 966, "estimated_template_tokens": 1254, "total_estimated_tokens": 5796}
2026-03-24 11:14:59,461 INFO transcription_utils.py:_extract_system_prompt:1753 TransformerProcessor 7df5cb7a-4089-4932-baef-5c8414697774 Systemprompt aus Template extrahiert
{"prompt_length": 2504}
2026-03-24 11:14:59,461 INFO transcription_utils.py:transform_by_template:1042 TransformerProcessor 7df5cb7a-4089-4932-baef-5c8414697774 Verwende direkt übergebenes Template-Inhalt (Länge: 5567)
2026-03-24 11:14:59,461 INFO transcription_utils.py:transform_by_template:1027 TransformerProcessor 7df5cb7a-4089-4932-baef-5c8414697774 Starte Template-Transformation:
2026-03-24 11:14:59,461 DEBUG transformer_processor.py:__init__:199 TransformerProcessor 7df5cb7a-4089-4932-baef-5c8414697774 Transformer Init Timings
{"total_init_ms": 264.58, "super_init_ms": 153.18, "config_load_ms": 16.86, "client_init_ms": 0.01, "transcriber_init_ms": 94.07}
2026-03-24 11:14:59,460 DEBUG transformer_processor.py:__init__:179 TransformerProcessor 7df5cb7a-4089-4932-baef-5c8414697774 Transformer Processor initialisiert
{"model": "google/gemini-2.5-flash", "temperature": 0.1, "max_tokens": 4000, "target_format": "text", "max_concurrent_requests": 10, "timeout_seconds": 120, "templates_dir": "resources/templates"}
2026-03-24 11:14:59,197 DEBUG base_processor.py:init_logger:386 TransformerProcessor 7df5cb7a-4089-4932-baef-5c8414697774 Logger initialisiert
2026-03-24 11:14:04,073 INFO rag_processor.py:embed_document_for_client:571 RAGProcessor 3a091065-27de-4f28-a30c-20414fdcef9c Embedding (client) abgeschlossen
{"extra": {"document_id": "8caff1cc-4234-4c8b-bea6-dd8fb05e722a", "chunks": 1, "duration_seconds": 0.33763909339904785}}
2026-03-24 11:14:03,736 INFO rag_processor.py:embed_document_for_client:516 RAGProcessor 3a091065-27de-4f28-a30c-20414fdcef9c Starte Embedding-Generierung (client)
{"extra": {"chunk_count": 1, "embedding_model": "voyage-3-large", "embedding_dimensions": 2048}}
2026-03-24 11:14:03,736 INFO rag_processor.py:embed_document_for_client:510 RAGProcessor 3a091065-27de-4f28-a30c-20414fdcef9c Chunking abgeschlossen (client): 1 Chunks erstellt
2026-03-24 11:14:03,736 INFO rag_processor.py:embed_document_for_client:508 RAGProcessor 3a091065-27de-4f28-a30c-20414fdcef9c Starte Chunking (client)
{"extra": {"document_id": "8caff1cc-4234-4c8b-bea6-dd8fb05e722a"}}
2026-03-24 11:14:03,735 INFO rag_processor.py:__init__:124 RAGProcessor 3a091065-27de-4f28-a30c-20414fdcef9c RAGProcessor initialisiert
{"extra": {"embedding_model": "voyage-3-large", "embedding_dimensions": 2048, "chunk_size": 1000, "chunk_overlap": 200}}
2026-03-24 11:14:03,627 DEBUG base_processor.py:init_logger:386 RAGProcessor 3a091065-27de-4f28-a30c-20414fdcef9c Logger initialisiert
2026-03-24 11:14:03,447 INFO rag_processor.py:embed_document_for_client:571 RAGProcessor bbb0c3e0-b97a-4c2e-b784-e4cbf57283c1 Embedding (client) abgeschlossen
{"extra": {"document_id": "01XERETUPJ5CM4DRY2HRFLCI74ZOQV43WD", "chunks": 16, "duration_seconds": 0.5499463081359863}}
2026-03-24 11:14:02,897 INFO rag_processor.py:embed_document_for_client:516 RAGProcessor bbb0c3e0-b97a-4c2e-b784-e4cbf57283c1 Starte Embedding-Generierung (client)
{"extra": {"chunk_count": 16, "embedding_model": "voyage-3-large", "embedding_dimensions": 2048}}
2026-03-24 11:14:02,897 INFO rag_processor.py:embed_document_for_client:510 RAGProcessor bbb0c3e0-b97a-4c2e-b784-e4cbf57283c1 Chunking abgeschlossen (client): 16 Chunks erstellt
2026-03-24 11:14:02,897 INFO rag_processor.py:embed_document_for_client:508 RAGProcessor bbb0c3e0-b97a-4c2e-b784-e4cbf57283c1 Starte Chunking (client)
{"extra": {"document_id": "01XERETUPJ5CM4DRY2HRFLCI74ZOQV43WD"}}
2026-03-24 11:14:02,896 INFO rag_processor.py:__init__:124 RAGProcessor bbb0c3e0-b97a-4c2e-b784-e4cbf57283c1 RAGProcessor initialisiert
{"extra": {"embedding_model": "voyage-3-large", "embedding_dimensions": 2048, "chunk_size": 1000, "chunk_overlap": 200}}
2026-03-24 11:14:02,789 DEBUG base_processor.py:init_logger:386 RAGProcessor bbb0c3e0-b97a-4c2e-b784-e4cbf57283c1 Logger initialisiert
2026-03-24 11:13:27,773 INFO rag_processor.py:embed_document_for_client:571 RAGProcessor b69af346-1100-4cf1-98bf-6e109611b05c Embedding (client) abgeschlossen
{"extra": {"document_id": "0d38c21a-96ae-4ba1-b952-b53aa82e6b51", "chunks": 1, "duration_seconds": 0.27263736724853516}}
2026-03-24 11:13:27,501 INFO rag_processor.py:embed_document_for_client:516 RAGProcessor b69af346-1100-4cf1-98bf-6e109611b05c Starte Embedding-Generierung (client)
{"extra": {"chunk_count": 1, "embedding_model": "voyage-3-large", "embedding_dimensions": 2048}}
2026-03-24 11:13:27,501 INFO rag_processor.py:embed_document_for_client:510 RAGProcessor b69af346-1100-4cf1-98bf-6e109611b05c Chunking abgeschlossen (client): 1 Chunks erstellt
2026-03-24 11:13:27,500 INFO rag_processor.py:embed_document_for_client:508 RAGProcessor b69af346-1100-4cf1-98bf-6e109611b05c Starte Chunking (client)
{"extra": {"document_id": "0d38c21a-96ae-4ba1-b952-b53aa82e6b51"}}
2026-03-24 11:13:27,500 INFO rag_processor.py:__init__:124 RAGProcessor b69af346-1100-4cf1-98bf-6e109611b05c RAGProcessor initialisiert
{"extra": {"embedding_model": "voyage-3-large", "embedding_dimensions": 2048, "chunk_size": 1000, "chunk_overlap": 200}}
2026-03-24 11:13:27,223 DEBUG base_processor.py:init_logger:386 RAGProcessor b69af346-1100-4cf1-98bf-6e109611b05c Logger initialisiert
2026-03-24 11:13:27,115 INFO rag_processor.py:embed_document_for_client:571 RAGProcessor 7dacc08b-18a1-4bc3-bf26-c259a63539b6 Embedding (client) abgeschlossen
{"extra": {"document_id": "01XERETUJWKRVM3K2DFNAKLCDLQHB4YWOZ", "chunks": 7, "duration_seconds": 0.49013257026672363}}
2026-03-24 11:13:26,625 INFO rag_processor.py:embed_document_for_client:516 RAGProcessor 7dacc08b-18a1-4bc3-bf26-c259a63539b6 Starte Embedding-Generierung (client)
{"extra": {"chunk_count": 7, "embedding_model": "voyage-3-large", "embedding_dimensions": 2048}}
2026-03-24 11:13:26,625 INFO rag_processor.py:embed_document_for_client:510 RAGProcessor 7dacc08b-18a1-4bc3-bf26-c259a63539b6 Chunking abgeschlossen (client): 7 Chunks erstellt
2026-03-24 11:13:26,625 INFO rag_processor.py:embed_document_for_client:508 RAGProcessor 7dacc08b-18a1-4bc3-bf26-c259a63539b6 Starte Chunking (client)
{"extra": {"document_id": "01XERETUJWKRVM3K2DFNAKLCDLQHB4YWOZ"}}
2026-03-24 11:13:26,624 INFO rag_processor.py:__init__:124 RAGProcessor 7dacc08b-18a1-4bc3-bf26-c259a63539b6 RAGProcessor initialisiert
{"extra": {"embedding_model": "voyage-3-large", "embedding_dimensions": 2048, "chunk_size": 1000, "chunk_overlap": 200}}
2026-03-24 11:13:26,524 DEBUG base_processor.py:init_logger:386 RAGProcessor 7dacc08b-18a1-4bc3-bf26-c259a63539b6 Logger initialisiert
2026-03-24 11:12:39,170 INFO rag_processor.py:embed_document_for_client:571 RAGProcessor 6f3c72b6-1ae4-4d62-baea-ca4bbbb35719 Embedding (client) abgeschlossen
{"extra": {"document_id": "d8705ede-f1f1-4549-b4bd-0e58a070c599", "chunks": 1, "duration_seconds": 0.5594875812530518}}
2026-03-24 11:12:38,611 INFO rag_processor.py:embed_document_for_client:516 RAGProcessor 6f3c72b6-1ae4-4d62-baea-ca4bbbb35719 Starte Embedding-Generierung (client)
{"extra": {"chunk_count": 1, "embedding_model": "voyage-3-large", "embedding_dimensions": 2048}}
2026-03-24 11:12:38,611 INFO rag_processor.py:embed_document_for_client:510 RAGProcessor 6f3c72b6-1ae4-4d62-baea-ca4bbbb35719 Chunking abgeschlossen (client): 1 Chunks erstellt
2026-03-24 11:12:38,610 INFO rag_processor.py:embed_document_for_client:508 RAGProcessor 6f3c72b6-1ae4-4d62-baea-ca4bbbb35719 Starte Chunking (client)
{"extra": {"document_id": "d8705ede-f1f1-4549-b4bd-0e58a070c599"}}
2026-03-24 11:12:38,610 INFO rag_processor.py:__init__:124 RAGProcessor 6f3c72b6-1ae4-4d62-baea-ca4bbbb35719 RAGProcessor initialisiert
{"extra": {"embedding_model": "voyage-3-large", "embedding_dimensions": 2048, "chunk_size": 1000, "chunk_overlap": 200}}
2026-03-24 11:12:38,451 DEBUG base_processor.py:init_logger:386 RAGProcessor 6f3c72b6-1ae4-4d62-baea-ca4bbbb35719 Logger initialisiert
2026-03-24 11:12:38,291 INFO rag_processor.py:embed_document_for_client:571 RAGProcessor 90cf5695-7935-4174-a7bb-255194fe49f3 Embedding (client) abgeschlossen
{"extra": {"document_id": "01XERETUPHI3PNKVG5VBHIF4YSUR5GKFJJ", "chunks": 10, "duration_seconds": 0.4153444766998291}}
2026-03-24 11:12:37,876 INFO rag_processor.py:embed_document_for_client:516 RAGProcessor 90cf5695-7935-4174-a7bb-255194fe49f3 Starte Embedding-Generierung (client)
{"extra": {"chunk_count": 10, "embedding_model": "voyage-3-large", "embedding_dimensions": 2048}}
2026-03-24 11:12:37,876 INFO rag_processor.py:embed_document_for_client:510 RAGProcessor 90cf5695-7935-4174-a7bb-255194fe49f3 Chunking abgeschlossen (client): 10 Chunks erstellt
2026-03-24 11:12:37,875 INFO rag_processor.py:embed_document_for_client:508 RAGProcessor 90cf5695-7935-4174-a7bb-255194fe49f3 Starte Chunking (client)
{"extra": {"document_id": "01XERETUPHI3PNKVG5VBHIF4YSUR5GKFJJ"}}
2026-03-24 11:12:37,875 INFO rag_processor.py:__init__:124 RAGProcessor 90cf5695-7935-4174-a7bb-255194fe49f3 RAGProcessor initialisiert
{"extra": {"embedding_model": "voyage-3-large", "embedding_dimensions": 2048, "chunk_size": 1000, "chunk_overlap": 200}}
2026-03-24 11:12:37,742 DEBUG base_processor.py:init_logger:386 RAGProcessor 90cf5695-7935-4174-a7bb-255194fe49f3 Logger initialisiert
2026-03-24 11:12:37,137 INFO rag_processor.py:embed_document_for_client:571 RAGProcessor 4250a485-0522-4585-a5dd-e75476b6a747 Embedding (client) abgeschlossen
{"extra": {"document_id": "573fefca-0308-470b-8d02-f78847cd5bee", "chunks": 1, "duration_seconds": 0.33070874214172363}}
2026-03-24 11:12:36,806 INFO rag_processor.py:embed_document_for_client:516 RAGProcessor 4250a485-0522-4585-a5dd-e75476b6a747 Starte Embedding-Generierung (client)
{"extra": {"chunk_count": 1, "embedding_model": "voyage-3-large", "embedding_dimensions": 2048}}
2026-03-24 11:12:36,806 INFO rag_processor.py:embed_document_for_client:510 RAGProcessor 4250a485-0522-4585-a5dd-e75476b6a747 Chunking abgeschlossen (client): 1 Chunks erstellt
2026-03-24 11:12:36,806 INFO rag_processor.py:embed_document_for_client:508 RAGProcessor 4250a485-0522-4585-a5dd-e75476b6a747 Starte Chunking (client)
{"extra": {"document_id": "573fefca-0308-470b-8d02-f78847cd5bee"}}
2026-03-24 11:12:36,805 INFO rag_processor.py:__init__:124 RAGProcessor 4250a485-0522-4585-a5dd-e75476b6a747 RAGProcessor initialisiert
{"extra": {"embedding_model": "voyage-3-large", "embedding_dimensions": 2048, "chunk_size": 1000, "chunk_overlap": 200}}
2026-03-24 11:12:36,707 DEBUG base_processor.py:init_logger:386 RAGProcessor 4250a485-0522-4585-a5dd-e75476b6a747 Logger initialisiert
2026-03-24 11:12:36,549 INFO rag_processor.py:embed_document_for_client:571 RAGProcessor e764c0dc-d126-4c7f-b2fb-7048959075bc Embedding (client) abgeschlossen
{"extra": {"document_id": "01XERETUKH3PFJCANPYVAITKNKU5ICKPEG", "chunks": 12, "duration_seconds": 0.6034154891967773}}
2026-03-24 11:12:35,946 INFO rag_processor.py:embed_document_for_client:516 RAGProcessor e764c0dc-d126-4c7f-b2fb-7048959075bc Starte Embedding-Generierung (client)
{"extra": {"chunk_count": 12, "embedding_model": "voyage-3-large", "embedding_dimensions": 2048}}
2026-03-24 11:12:35,946 INFO rag_processor.py:embed_document_for_client:510 RAGProcessor e764c0dc-d126-4c7f-b2fb-7048959075bc Chunking abgeschlossen (client): 12 Chunks erstellt
2026-03-24 11:12:35,946 INFO rag_processor.py:embed_document_for_client:508 RAGProcessor e764c0dc-d126-4c7f-b2fb-7048959075bc Starte Chunking (client)
{"extra": {"document_id": "01XERETUKH3PFJCANPYVAITKNKU5ICKPEG"}}
2026-03-24 11:12:35,945 INFO rag_processor.py:__init__:124 RAGProcessor e764c0dc-d126-4c7f-b2fb-7048959075bc RAGProcessor initialisiert
{"extra": {"embedding_model": "voyage-3-large", "embedding_dimensions": 2048, "chunk_size": 1000, "chunk_overlap": 200}}
2026-03-24 11:12:35,815 DEBUG base_processor.py:init_logger:386 RAGProcessor e764c0dc-d126-4c7f-b2fb-7048959075bc Logger initialisiert
2026-03-24 11:12:35,753 INFO rag_processor.py:embed_document_for_client:571 RAGProcessor 507eaed3-dbb3-4c92-8548-e6804688fb42 Embedding (client) abgeschlossen
{"extra": {"document_id": "f108a8cd-ffea-4270-8b72-c5156ce05748", "chunks": 1, "duration_seconds": 0.3651413917541504}}
2026-03-24 11:12:35,388 INFO rag_processor.py:embed_document_for_client:516 RAGProcessor 507eaed3-dbb3-4c92-8548-e6804688fb42 Starte Embedding-Generierung (client)
{"extra": {"chunk_count": 1, "embedding_model": "voyage-3-large", "embedding_dimensions": 2048}}
2026-03-24 11:12:35,388 INFO rag_processor.py:embed_document_for_client:510 RAGProcessor 507eaed3-dbb3-4c92-8548-e6804688fb42 Chunking abgeschlossen (client): 1 Chunks erstellt
2026-03-24 11:12:35,388 INFO rag_processor.py:embed_document_for_client:508 RAGProcessor 507eaed3-dbb3-4c92-8548-e6804688fb42 Starte Chunking (client)
{"extra": {"document_id": "f108a8cd-ffea-4270-8b72-c5156ce05748"}}
2026-03-24 11:12:35,388 INFO rag_processor.py:__init__:124 RAGProcessor 507eaed3-dbb3-4c92-8548-e6804688fb42 RAGProcessor initialisiert
{"extra": {"embedding_model": "voyage-3-large", "embedding_dimensions": 2048, "chunk_size": 1000, "chunk_overlap": 200}}
2026-03-24 11:12:35,258 DEBUG base_processor.py:init_logger:386 RAGProcessor 507eaed3-dbb3-4c92-8548-e6804688fb42 Logger initialisiert
2026-03-24 11:12:35,090 INFO rag_processor.py:embed_document_for_client:571 RAGProcessor fe2ad35c-fa8a-49ea-89c9-ae925eddc3ce Embedding (client) abgeschlossen
{"extra": {"document_id": "01XERETUIJ55LVI6ZKBFAZERHTWPSB7MMS", "chunks": 11, "duration_seconds": 0.7001423835754395}}
2026-03-24 11:12:34,390 INFO rag_processor.py:embed_document_for_client:516 RAGProcessor fe2ad35c-fa8a-49ea-89c9-ae925eddc3ce Starte Embedding-Generierung (client)
{"extra": {"chunk_count": 11, "embedding_model": "voyage-3-large", "embedding_dimensions": 2048}}
2026-03-24 11:12:34,390 INFO rag_processor.py:embed_document_for_client:510 RAGProcessor fe2ad35c-fa8a-49ea-89c9-ae925eddc3ce Chunking abgeschlossen (client): 11 Chunks erstellt
2026-03-24 11:12:34,389 INFO rag_processor.py:embed_document_for_client:508 RAGProcessor fe2ad35c-fa8a-49ea-89c9-ae925eddc3ce Starte Chunking (client)
{"extra": {"document_id": "01XERETUIJ55LVI6ZKBFAZERHTWPSB7MMS"}}
2026-03-24 11:12:34,389 INFO rag_processor.py:__init__:124 RAGProcessor fe2ad35c-fa8a-49ea-89c9-ae925eddc3ce RAGProcessor initialisiert
{"extra": {"embedding_model": "voyage-3-large", "embedding_dimensions": 2048, "chunk_size": 1000, "chunk_overlap": 200}}
2026-03-24 11:12:34,284 DEBUG base_processor.py:init_logger:386 RAGProcessor fe2ad35c-fa8a-49ea-89c9-ae925eddc3ce Logger initialisiert
2026-03-24 11:11:37,991 INFO transcription_utils.py:transform_by_template:1461 TransformerProcessor 69529f99-9dd3-450d-a5f8-419c85f3adc9 Template-Transformation abgeschlossen
{"duration_ms": 20275.29263496399, "model": "google/gemini-2.5-flash"}
2026-03-24 11:11:37,989 DEBUG transcription_utils.py:create_llm_request:455 TransformerProcessor 69529f99-9dd3-450d-a5f8-419c85f3adc9 LLM Interaktion gespeichert
{"file": "cache/transformer/temp/debug/llm/20260324_111137_template_transform.json", "tokens": 10546}
2026-03-24 11:11:37,988 DEBUG transcription_utils.py:create_llm_request:416 TransformerProcessor 69529f99-9dd3-450d-a5f8-419c85f3adc9 LLM Request erstellt
{"purpose": "template_transform", "tokens": 10546, "duration": 20275.29263496399, "model": "google/gemini-2.5-flash", "processor": "WhisperTranscriber"}
2026-03-24 11:11:37,988 DEBUG base_processor.py:add_llm_requests:510 TransformerProcessor 69529f99-9dd3-450d-a5f8-419c85f3adc9 1 LLM-Requests hinzugefügt
2026-03-24 11:11:17,712 INFO transcription_utils.py:transform_by_template:1187 TransformerProcessor 69529f99-9dd3-450d-a5f8-419c85f3adc9 Sende Anfrage an LLM Provider
2026-03-24 11:11:17,712 INFO transcription_utils.py:transform_by_template:1171 TransformerProcessor 69529f99-9dd3-450d-a5f8-419c85f3adc9 Token-Prüfung vor Template-Transformation
{"model": "google/gemini-2.5-flash", "model_limit": 8192, "text_length": 13083, "estimated_text_tokens": 5760, "estimated_system_tokens": 1700, "estimated_template_tokens": 1518, "total_estimated_tokens": 8978}
2026-03-24 11:11:17,689 INFO transcription_utils.py:_extract_system_prompt:1753 TransformerProcessor 69529f99-9dd3-450d-a5f8-419c85f3adc9 Systemprompt aus Template extrahiert
{"prompt_length": 4502}
2026-03-24 11:11:17,688 INFO transcription_utils.py:transform_by_template:1042 TransformerProcessor 69529f99-9dd3-450d-a5f8-419c85f3adc9 Verwende direkt übergebenes Template-Inhalt (Länge: 8552)
2026-03-24 11:11:17,688 INFO transcription_utils.py:transform_by_template:1027 TransformerProcessor 69529f99-9dd3-450d-a5f8-419c85f3adc9 Starte Template-Transformation:
2026-03-24 11:11:17,688 DEBUG transformer_processor.py:__init__:199 TransformerProcessor 69529f99-9dd3-450d-a5f8-419c85f3adc9 Transformer Init Timings
{"total_init_ms": 207.55, "super_init_ms": 105.36, "config_load_ms": 19.04, "client_init_ms": 0.01, "transcriber_init_ms": 82.82}
2026-03-24 11:11:17,688 DEBUG transformer_processor.py:__init__:179 TransformerProcessor 69529f99-9dd3-450d-a5f8-419c85f3adc9 Transformer Processor initialisiert
{"model": "google/gemini-2.5-flash", "temperature": 0.1, "max_tokens": 4000, "target_format": "text", "max_concurrent_requests": 10, "timeout_seconds": 120, "templates_dir": "resources/templates"}
2026-03-24 11:11:17,481 DEBUG base_processor.py:init_logger:386 TransformerProcessor 69529f99-9dd3-450d-a5f8-419c85f3adc9 Logger initialisiert
2026-03-24 11:08:08,249 INFO rag_processor.py:embed_document_for_client:571 RAGProcessor 5dfb2535-110d-4b31-9377-6d922353f7b3 Embedding (client) abgeschlossen
{"extra": {"document_id": "afa761d2-4b31-4010-965e-ca0d33ae6b8c", "chunks": 1, "duration_seconds": 0.2681097984313965}}
2026-03-24 11:08:07,981 INFO rag_processor.py:embed_document_for_client:516 RAGProcessor 5dfb2535-110d-4b31-9377-6d922353f7b3 Starte Embedding-Generierung (client)
{"extra": {"chunk_count": 1, "embedding_model": "voyage-3-large", "embedding_dimensions": 2048}}
2026-03-24 11:08:07,981 INFO rag_processor.py:embed_document_for_client:510 RAGProcessor 5dfb2535-110d-4b31-9377-6d922353f7b3 Chunking abgeschlossen (client): 1 Chunks erstellt
2026-03-24 11:08:07,981 INFO rag_processor.py:embed_document_for_client:508 RAGProcessor 5dfb2535-110d-4b31-9377-6d922353f7b3 Starte Chunking (client)
{"extra": {"document_id": "afa761d2-4b31-4010-965e-ca0d33ae6b8c"}}
2026-03-24 11:08:07,981 INFO rag_processor.py:__init__:124 RAGProcessor 5dfb2535-110d-4b31-9377-6d922353f7b3 RAGProcessor initialisiert
{"extra": {"embedding_model": "voyage-3-large", "embedding_dimensions": 2048, "chunk_size": 1000, "chunk_overlap": 200}}
2026-03-24 11:08:07,848 DEBUG base_processor.py:init_logger:386 RAGProcessor 5dfb2535-110d-4b31-9377-6d922353f7b3 Logger initialisiert
2026-03-24 11:08:07,686 INFO rag_processor.py:embed_document_for_client:571 RAGProcessor 6884f0f2-0282-4d57-bf5a-0dc5337db5f6 Embedding (client) abgeschlossen
{"extra": {"document_id": "V2Vic2VpdGUvMjAyNi8yMDI2LTAxIFdvcmtzaG9wIE9sZGllcy8yMDI2LTAyIFdvcmtzaG9wIE9sZGllcy5tZA==", "chunks": 10, "duration_seconds": 0.6099324226379395}}
2026-03-24 11:08:07,077 INFO rag_processor.py:embed_document_for_client:516 RAGProcessor 6884f0f2-0282-4d57-bf5a-0dc5337db5f6 Starte Embedding-Generierung (client)
{"extra": {"chunk_count": 10, "embedding_model": "voyage-3-large", "embedding_dimensions": 2048}}
2026-03-24 11:08:07,077 INFO rag_processor.py:embed_document_for_client:510 RAGProcessor 6884f0f2-0282-4d57-bf5a-0dc5337db5f6 Chunking abgeschlossen (client): 10 Chunks erstellt
2026-03-24 11:08:07,076 INFO rag_processor.py:embed_document_for_client:508 RAGProcessor 6884f0f2-0282-4d57-bf5a-0dc5337db5f6 Starte Chunking (client)
{"extra": {"document_id": "V2Vic2VpdGUvMjAyNi8yMDI2LTAxIFdvcmtzaG9wIE9sZGllcy8yMDI2LTAyIFdvcmtzaG9wIE9sZGllcy5tZA=="}}
2026-03-24 11:08:07,076 INFO rag_processor.py:__init__:124 RAGProcessor 6884f0f2-0282-4d57-bf5a-0dc5337db5f6 RAGProcessor initialisiert
{"extra": {"embedding_model": "voyage-3-large", "embedding_dimensions": 2048, "chunk_size": 1000, "chunk_overlap": 200}}
2026-03-24 11:08:06,974 DEBUG base_processor.py:init_logger:386 RAGProcessor 6884f0f2-0282-4d57-bf5a-0dc5337db5f6 Logger initialisiert
2026-03-24 11:08:04,383 INFO transcription_utils.py:transform_by_template:1461 TransformerProcessor 6625fa43-a3c2-4993-babf-41a1912751f4 Template-Transformation abgeschlossen
{"duration_ms": 9736.361265182495, "model": "google/gemini-2.5-flash"}
2026-03-24 11:08:04,382 DEBUG transcription_utils.py:create_llm_request:455 TransformerProcessor 6625fa43-a3c2-4993-babf-41a1912751f4 LLM Interaktion gespeichert
{"file": "cache/transformer/temp/debug/llm/20260324_110804_template_transform.json", "tokens": 6557}
2026-03-24 11:08:04,381 DEBUG transcription_utils.py:create_llm_request:416 TransformerProcessor 6625fa43-a3c2-4993-babf-41a1912751f4 LLM Request erstellt
{"purpose": "template_transform", "tokens": 6557, "duration": 9736.361265182495, "model": "google/gemini-2.5-flash", "processor": "WhisperTranscriber"}
2026-03-24 11:08:04,381 DEBUG base_processor.py:add_llm_requests:510 TransformerProcessor 6625fa43-a3c2-4993-babf-41a1912751f4 1 LLM-Requests hinzugefügt
2026-03-24 11:08:04,040 INFO rag_processor.py:embed_document_for_client:571 RAGProcessor 04614817-5d71-4adf-b26d-1022472dd7a2 Embedding (client) abgeschlossen
{"extra": {"document_id": "db7b1e0b-2380-4c3e-95b7-dc7c7bcc5cbe", "chunks": 1, "duration_seconds": 0.2566792964935303}}
2026-03-24 11:08:03,784 INFO rag_processor.py:embed_document_for_client:516 RAGProcessor 04614817-5d71-4adf-b26d-1022472dd7a2 Starte Embedding-Generierung (client)
{"extra": {"chunk_count": 1, "embedding_model": "voyage-3-large", "embedding_dimensions": 2048}}
2026-03-24 11:08:03,784 INFO rag_processor.py:embed_document_for_client:510 RAGProcessor 04614817-5d71-4adf-b26d-1022472dd7a2 Chunking abgeschlossen (client): 1 Chunks erstellt
2026-03-24 11:08:03,784 INFO rag_processor.py:embed_document_for_client:508 RAGProcessor 04614817-5d71-4adf-b26d-1022472dd7a2 Starte Chunking (client)
{"extra": {"document_id": "db7b1e0b-2380-4c3e-95b7-dc7c7bcc5cbe"}}
2026-03-24 11:08:03,783 INFO rag_processor.py:__init__:124 RAGProcessor 04614817-5d71-4adf-b26d-1022472dd7a2 RAGProcessor initialisiert
{"extra": {"embedding_model": "voyage-3-large", "embedding_dimensions": 2048, "chunk_size": 1000, "chunk_overlap": 200}}
2026-03-24 11:08:03,676 DEBUG base_processor.py:init_logger:386 RAGProcessor 04614817-5d71-4adf-b26d-1022472dd7a2 Logger initialisiert
2026-03-24 11:08:03,604 INFO rag_processor.py:embed_document_for_client:571 RAGProcessor 025ea45c-e1a4-4e24-ae0d-4309e48cceba Embedding (client) abgeschlossen
{"extra": {"document_id": "V2Vic2VpdGUvMjAyNi8yMDI2LTAyIFZlcndhaHJsb3N0ZSBCYWhuaMO2ZmUvMjAyNi0wMiBWZXJ3YWhybG9zdGUgQmFobmjDtmZlLm1k", "chunks": 3, "duration_seconds": 0.37369728088378906}}
2026-03-24 11:08:03,231 INFO rag_processor.py:embed_document_for_client:516 RAGProcessor 025ea45c-e1a4-4e24-ae0d-4309e48cceba Starte Embedding-Generierung (client)
{"extra": {"chunk_count": 3, "embedding_model": "voyage-3-large", "embedding_dimensions": 2048}}
2026-03-24 11:08:03,230 INFO rag_processor.py:embed_document_for_client:510 RAGProcessor 025ea45c-e1a4-4e24-ae0d-4309e48cceba Chunking abgeschlossen (client): 3 Chunks erstellt
2026-03-24 11:08:03,230 INFO rag_processor.py:embed_document_for_client:508 RAGProcessor 025ea45c-e1a4-4e24-ae0d-4309e48cceba Starte Chunking (client)
{"extra": {"document_id": "V2Vic2VpdGUvMjAyNi8yMDI2LTAyIFZlcndhaHJsb3N0ZSBCYWhuaMO2ZmUvMjAyNi0wMiBWZXJ3YWhybG9zdGUgQmFobmjDtmZlLm1k"}}
2026-03-24 11:08:03,230 INFO rag_processor.py:__init__:124 RAGProcessor 025ea45c-e1a4-4e24-ae0d-4309e48cceba RAGProcessor initialisiert
{"extra": {"embedding_model": "voyage-3-large", "embedding_dimensions": 2048, "chunk_size": 1000, "chunk_overlap": 200}}
2026-03-24 11:08:03,098 DEBUG base_processor.py:init_logger:386 RAGProcessor 025ea45c-e1a4-4e24-ae0d-4309e48cceba Logger initialisiert
2026-03-24 11:07:59,986 INFO transcription_utils.py:transform_by_template:1461 TransformerProcessor 83299804-6e1c-48a9-95fe-b12a539832b9 Template-Transformation abgeschlossen
{"duration_ms": 19782.96685218811, "model": "google/gemini-2.5-flash"}
2026-03-24 11:07:59,986 DEBUG transcription_utils.py:create_llm_request:455 TransformerProcessor 83299804-6e1c-48a9-95fe-b12a539832b9 LLM Interaktion gespeichert
{"file": "cache/transformer/temp/debug/llm/20260324_110759_template_transform.json", "tokens": 2698}
2026-03-24 11:07:59,985 DEBUG transcription_utils.py:create_llm_request:416 TransformerProcessor 83299804-6e1c-48a9-95fe-b12a539832b9 LLM Request erstellt
{"purpose": "template_transform", "tokens": 2698, "duration": 19782.96685218811, "model": "google/gemini-2.5-flash", "processor": "WhisperTranscriber"}
2026-03-24 11:07:59,985 DEBUG base_processor.py:add_llm_requests:510 TransformerProcessor 83299804-6e1c-48a9-95fe-b12a539832b9 1 LLM-Requests hinzugefügt
2026-03-24 11:07:57,188 INFO rag_processor.py:embed_document_for_client:571 RAGProcessor 8133ec34-b9fd-48c1-a81a-1a672768de18 Embedding (client) abgeschlossen
{"extra": {"document_id": "b8fb985a-4f59-4817-a605-fdc24c9d8e7a", "chunks": 1, "duration_seconds": 0.2695167064666748}}
2026-03-24 11:07:56,918 INFO rag_processor.py:embed_document_for_client:516 RAGProcessor 8133ec34-b9fd-48c1-a81a-1a672768de18 Starte Embedding-Generierung (client)
{"extra": {"chunk_count": 1, "embedding_model": "voyage-3-large", "embedding_dimensions": 2048}}
2026-03-24 11:07:56,918 INFO rag_processor.py:embed_document_for_client:510 RAGProcessor 8133ec34-b9fd-48c1-a81a-1a672768de18 Chunking abgeschlossen (client): 1 Chunks erstellt
2026-03-24 11:07:56,918 INFO rag_processor.py:embed_document_for_client:508 RAGProcessor 8133ec34-b9fd-48c1-a81a-1a672768de18 Starte Chunking (client)
{"extra": {"document_id": "b8fb985a-4f59-4817-a605-fdc24c9d8e7a"}}
2026-03-24 11:07:56,918 INFO rag_processor.py:__init__:124 RAGProcessor 8133ec34-b9fd-48c1-a81a-1a672768de18 RAGProcessor initialisiert
{"extra": {"embedding_model": "voyage-3-large", "embedding_dimensions": 2048, "chunk_size": 1000, "chunk_overlap": 200}}
2026-03-24 11:07:56,811 DEBUG base_processor.py:init_logger:386 RAGProcessor 8133ec34-b9fd-48c1-a81a-1a672768de18 Logger initialisiert
2026-03-24 11:07:56,715 INFO rag_processor.py:embed_document_for_client:571 RAGProcessor 0476c258-3785-4152-bd06-25d0aad5c1b8 Embedding (client) abgeschlossen
{"extra": {"document_id": "V2Vic2VpdGUvMjAyNi8yMDI2LTAyIELDvG5kbmlzdGFnZSBDbGltYXRlIGFjdGlvbi8yMDI2LTAyQsO8bmRuaXN0YWdlIENsaW1hdGUgYWN0aW9uLm1k", "chunks": 4, "duration_seconds": 0.4077777862548828}}
2026-03-24 11:07:56,307 INFO rag_processor.py:embed_document_for_client:516 RAGProcessor 0476c258-3785-4152-bd06-25d0aad5c1b8 Starte Embedding-Generierung (client)
{"extra": {"chunk_count": 4, "embedding_model": "voyage-3-large", "embedding_dimensions": 2048}}
2026-03-24 11:07:56,307 INFO rag_processor.py:embed_document_for_client:510 RAGProcessor 0476c258-3785-4152-bd06-25d0aad5c1b8 Chunking abgeschlossen (client): 4 Chunks erstellt
2026-03-24 11:07:56,307 INFO rag_processor.py:embed_document_for_client:508 RAGProcessor 0476c258-3785-4152-bd06-25d0aad5c1b8 Starte Chunking (client)
{"extra": {"document_id": "V2Vic2VpdGUvMjAyNi8yMDI2LTAyIELDvG5kbmlzdGFnZSBDbGltYXRlIGFjdGlvbi8yMDI2LTAyQsO8bmRuaXN0YWdlIENsaW1hdGUgYWN0aW9uLm1k"}}
2026-03-24 11:07:56,306 INFO rag_processor.py:__init__:124 RAGProcessor 0476c258-3785-4152-bd06-25d0aad5c1b8 RAGProcessor initialisiert
{"extra": {"embedding_model": "voyage-3-large", "embedding_dimensions": 2048, "chunk_size": 1000, "chunk_overlap": 200}}
2026-03-24 11:07:56,196 DEBUG base_processor.py:init_logger:386 RAGProcessor 0476c258-3785-4152-bd06-25d0aad5c1b8 Logger initialisiert
2026-03-24 11:07:54,644 INFO transcription_utils.py:transform_by_template:1187 TransformerProcessor 6625fa43-a3c2-4993-babf-41a1912751f4 Sende Anfrage an LLM Provider
2026-03-24 11:07:54,644 INFO transcription_utils.py:transform_by_template:1171 TransformerProcessor 6625fa43-a3c2-4993-babf-41a1912751f4 Token-Prüfung vor Template-Transformation
{"model": "google/gemini-2.5-flash", "model_limit": 8192, "text_length": 10367, "estimated_text_tokens": 3576, "estimated_system_tokens": 966, "estimated_template_tokens": 1254, "total_estimated_tokens": 5796}
2026-03-24 11:07:54,621 INFO transcription_utils.py:_extract_system_prompt:1753 TransformerProcessor 6625fa43-a3c2-4993-babf-41a1912751f4 Systemprompt aus Template extrahiert
{"prompt_length": 2504}
2026-03-24 11:07:54,621 INFO transcription_utils.py:transform_by_template:1042 TransformerProcessor 6625fa43-a3c2-4993-babf-41a1912751f4 Verwende direkt übergebenes Template-Inhalt (Länge: 5567)
2026-03-24 11:07:54,621 INFO transcription_utils.py:transform_by_template:1027 TransformerProcessor 6625fa43-a3c2-4993-babf-41a1912751f4 Starte Template-Transformation:
2026-03-24 11:07:54,621 DEBUG transformer_processor.py:__init__:199 TransformerProcessor 6625fa43-a3c2-4993-babf-41a1912751f4 Transformer Init Timings
{"total_init_ms": 197.77, "super_init_ms": 99.25, "config_load_ms": 17.36, "client_init_ms": 0.01, "transcriber_init_ms": 80.86}
2026-03-24 11:07:54,621 DEBUG transformer_processor.py:__init__:179 TransformerProcessor 6625fa43-a3c2-4993-babf-41a1912751f4 Transformer Processor initialisiert
{"model": "google/gemini-2.5-flash", "temperature": 0.1, "max_tokens": 4000, "target_format": "text", "max_concurrent_requests": 10, "timeout_seconds": 120, "templates_dir": "resources/templates"}
2026-03-24 11:07:54,613 INFO rag_processor.py:embed_document_for_client:571 RAGProcessor b90e5210-f95d-49fb-b11c-7080135cc6dd Embedding (client) abgeschlossen
{"extra": {"document_id": "3717d41e-4891-445d-aae4-026428b0437b", "chunks": 1, "duration_seconds": 0.27434587478637695}}
2026-03-24 11:07:54,424 DEBUG base_processor.py:init_logger:386 TransformerProcessor 6625fa43-a3c2-4993-babf-41a1912751f4 Logger initialisiert
2026-03-24 11:07:54,339 INFO rag_processor.py:embed_document_for_client:516 RAGProcessor b90e5210-f95d-49fb-b11c-7080135cc6dd Starte Embedding-Generierung (client)
{"extra": {"chunk_count": 1, "embedding_model": "voyage-3-large", "embedding_dimensions": 2048}}
2026-03-24 11:07:54,339 INFO rag_processor.py:embed_document_for_client:510 RAGProcessor b90e5210-f95d-49fb-b11c-7080135cc6dd Chunking abgeschlossen (client): 1 Chunks erstellt
2026-03-24 11:07:54,339 INFO rag_processor.py:embed_document_for_client:508 RAGProcessor b90e5210-f95d-49fb-b11c-7080135cc6dd Starte Chunking (client)
{"extra": {"document_id": "3717d41e-4891-445d-aae4-026428b0437b"}}
2026-03-24 11:07:54,338 INFO rag_processor.py:__init__:124 RAGProcessor b90e5210-f95d-49fb-b11c-7080135cc6dd RAGProcessor initialisiert
{"extra": {"embedding_model": "voyage-3-large", "embedding_dimensions": 2048, "chunk_size": 1000, "chunk_overlap": 200}}
2026-03-24 11:07:54,181 DEBUG base_processor.py:init_logger:386 RAGProcessor b90e5210-f95d-49fb-b11c-7080135cc6dd Logger initialisiert
2026-03-24 11:07:54,117 INFO rag_processor.py:embed_document_for_client:571 RAGProcessor d444b091-88b4-45ec-ba9a-c2beb60b28df Embedding (client) abgeschlossen
{"extra": {"document_id": "V2Vic2VpdGUvMjAyNi8yMDI2LTAyIFZlcndhaHJsb3N0ZSBCYWhuaMO2ZmUvVmVyd2Focmxvc3RlIEJhaG5ow7ZmZSBXaGF0c0FwcCBQdHQgMjAyNi0wMi0yNiBhdCAxNy4xMi4zMy5vZ2c=", "chunks": 4, "duration_seconds": 0.3970017433166504}}
2026-03-24 11:07:53,721 INFO rag_processor.py:embed_document_for_client:516 RAGProcessor d444b091-88b4-45ec-ba9a-c2beb60b28df Starte Embedding-Generierung (client)
{"extra": {"chunk_count": 4, "embedding_model": "voyage-3-large", "embedding_dimensions": 2048}}
2026-03-24 11:07:53,720 INFO rag_processor.py:embed_document_for_client:510 RAGProcessor d444b091-88b4-45ec-ba9a-c2beb60b28df Chunking abgeschlossen (client): 4 Chunks erstellt
2026-03-24 11:07:53,720 INFO rag_processor.py:embed_document_for_client:508 RAGProcessor d444b091-88b4-45ec-ba9a-c2beb60b28df Starte Chunking (client)
{"extra": {"document_id": "V2Vic2VpdGUvMjAyNi8yMDI2LTAyIFZlcndhaHJsb3N0ZSBCYWhuaMO2ZmUvVmVyd2Focmxvc3RlIEJhaG5ow7ZmZSBXaGF0c0FwcCBQdHQgMjAyNi0wMi0yNiBhdCAxNy4xMi4zMy5vZ2c="}}
2026-03-24 11:07:53,720 INFO rag_processor.py:__init__:124 RAGProcessor d444b091-88b4-45ec-ba9a-c2beb60b28df RAGProcessor initialisiert
{"extra": {"embedding_model": "voyage-3-large", "embedding_dimensions": 2048, "chunk_size": 1000, "chunk_overlap": 200}}
2026-03-24 11:07:53,593 DEBUG base_processor.py:init_logger:386 RAGProcessor d444b091-88b4-45ec-ba9a-c2beb60b28df Logger initialisiert
2026-03-24 11:07:53,455 INFO transcription_utils.py:transform_by_template:1461 TransformerProcessor 5a042acd-f511-481e-b2f7-605f77bf005d Template-Transformation abgeschlossen
{"duration_ms": 8473.973751068115, "model": "google/gemini-2.5-flash"}
2026-03-24 11:07:53,454 DEBUG transcription_utils.py:create_llm_request:455 TransformerProcessor 5a042acd-f511-481e-b2f7-605f77bf005d LLM Interaktion gespeichert
{"file": "cache/transformer/temp/debug/llm/20260324_110753_template_transform.json", "tokens": 2535}
2026-03-24 11:07:53,453 DEBUG transcription_utils.py:create_llm_request:416 TransformerProcessor 5a042acd-f511-481e-b2f7-605f77bf005d LLM Request erstellt
{"purpose": "template_transform", "tokens": 2535, "duration": 8473.973751068115, "model": "google/gemini-2.5-flash", "processor": "WhisperTranscriber"}
2026-03-24 11:07:53,452 DEBUG base_processor.py:add_llm_requests:510 TransformerProcessor 5a042acd-f511-481e-b2f7-605f77bf005d 1 LLM-Requests hinzugefügt
2026-03-24 11:07:51,150 INFO rag_processor.py:embed_document_for_client:571 RAGProcessor 3e375ab8-f690-423a-af46-516eac389b8a Embedding (client) abgeschlossen
{"extra": {"document_id": "0bc0c5ae-d478-40f5-8f79-3badd917a39d", "chunks": 1, "duration_seconds": 0.43700599670410156}}
2026-03-24 11:07:50,756 INFO transcription_utils.py:transform_by_template:1461 TransformerProcessor 3e4effd4-1349-478b-a061-bb4b662a021b Template-Transformation abgeschlossen
{"duration_ms": 8529.494047164917, "model": "google/gemini-2.5-flash"}
2026-03-24 11:07:50,755 DEBUG transcription_utils.py:create_llm_request:455 TransformerProcessor 3e4effd4-1349-478b-a061-bb4b662a021b LLM Interaktion gespeichert
{"file": "cache/transformer/temp/debug/llm/20260324_110750_template_transform.json", "tokens": 2997}
2026-03-24 11:07:50,755 DEBUG transcription_utils.py:create_llm_request:416 TransformerProcessor 3e4effd4-1349-478b-a061-bb4b662a021b LLM Request erstellt
{"purpose": "template_transform", "tokens": 2997, "duration": 8529.494047164917, "model": "google/gemini-2.5-flash", "processor": "WhisperTranscriber"}
2026-03-24 11:07:50,754 DEBUG base_processor.py:add_llm_requests:510 TransformerProcessor 3e4effd4-1349-478b-a061-bb4b662a021b 1 LLM-Requests hinzugefügt
2026-03-24 11:07:50,713 INFO rag_processor.py:embed_document_for_client:516 RAGProcessor 3e375ab8-f690-423a-af46-516eac389b8a Starte Embedding-Generierung (client)
{"extra": {"chunk_count": 1, "embedding_model": "voyage-3-large", "embedding_dimensions": 2048}}
2026-03-24 11:07:50,713 INFO rag_processor.py:embed_document_for_client:510 RAGProcessor 3e375ab8-f690-423a-af46-516eac389b8a Chunking abgeschlossen (client): 1 Chunks erstellt
2026-03-24 11:07:50,713 INFO rag_processor.py:embed_document_for_client:508 RAGProcessor 3e375ab8-f690-423a-af46-516eac389b8a Starte Chunking (client)
{"extra": {"document_id": "0bc0c5ae-d478-40f5-8f79-3badd917a39d"}}
2026-03-24 11:07:50,712 INFO rag_processor.py:__init__:124 RAGProcessor 3e375ab8-f690-423a-af46-516eac389b8a RAGProcessor initialisiert
{"extra": {"embedding_model": "voyage-3-large", "embedding_dimensions": 2048, "chunk_size": 1000, "chunk_overlap": 200}}
2026-03-24 11:07:50,606 DEBUG base_processor.py:init_logger:386 RAGProcessor 3e375ab8-f690-423a-af46-516eac389b8a Logger initialisiert
2026-03-24 11:07:50,511 INFO rag_processor.py:embed_document_for_client:571 RAGProcessor 2f66cdfd-498a-42ee-8b9c-89bc45c22c6b Embedding (client) abgeschlossen
{"extra": {"document_id": "V2Vic2VpdGUvMjAyNi8yMDI2LTIgVGVtcG8gMzAgYXVzc2VyIEtvbnRyb2xsZSBpbiBCcml4ZW4vMjAyNi0wMiBUZW1wbyAzMCBhdXNzZXIgS29udHJvbGxlLm1k", "chunks": 5, "duration_seconds": 0.7842473983764648}}
2026-03-24 11:07:49,727 INFO rag_processor.py:embed_document_for_client:516 RAGProcessor 2f66cdfd-498a-42ee-8b9c-89bc45c22c6b Starte Embedding-Generierung (client)
{"extra": {"chunk_count": 5, "embedding_model": "voyage-3-large", "embedding_dimensions": 2048}}
2026-03-24 11:07:49,727 INFO rag_processor.py:embed_document_for_client:510 RAGProcessor 2f66cdfd-498a-42ee-8b9c-89bc45c22c6b Chunking abgeschlossen (client): 5 Chunks erstellt
2026-03-24 11:07:49,727 INFO rag_processor.py:embed_document_for_client:508 RAGProcessor 2f66cdfd-498a-42ee-8b9c-89bc45c22c6b Starte Chunking (client)
{"extra": {"document_id": "V2Vic2VpdGUvMjAyNi8yMDI2LTIgVGVtcG8gMzAgYXVzc2VyIEtvbnRyb2xsZSBpbiBCcml4ZW4vMjAyNi0wMiBUZW1wbyAzMCBhdXNzZXIgS29udHJvbGxlLm1k"}}
2026-03-24 11:07:49,726 INFO rag_processor.py:__init__:124 RAGProcessor 2f66cdfd-498a-42ee-8b9c-89bc45c22c6b RAGProcessor initialisiert
{"extra": {"embedding_model": "voyage-3-large", "embedding_dimensions": 2048, "chunk_size": 1000, "chunk_overlap": 200}}
2026-03-24 11:07:49,584 DEBUG base_processor.py:init_logger:386 RAGProcessor 2f66cdfd-498a-42ee-8b9c-89bc45c22c6b Logger initialisiert
2026-03-24 11:07:49,248 INFO rag_processor.py:embed_document_for_client:571 RAGProcessor 34f726c1-d188-4c49-a7fa-e61fff7436e2 Embedding (client) abgeschlossen
{"extra": {"document_id": "c2aba61f-55a6-4ee5-8d47-30795010eddd", "chunks": 1, "duration_seconds": 0.31903982162475586}}
2026-03-24 11:07:48,929 INFO rag_processor.py:embed_document_for_client:516 RAGProcessor 34f726c1-d188-4c49-a7fa-e61fff7436e2 Starte Embedding-Generierung (client)
{"extra": {"chunk_count": 1, "embedding_model": "voyage-3-large", "embedding_dimensions": 2048}}
2026-03-24 11:07:48,929 INFO rag_processor.py:embed_document_for_client:510 RAGProcessor 34f726c1-d188-4c49-a7fa-e61fff7436e2 Chunking abgeschlossen (client): 1 Chunks erstellt
2026-03-24 11:07:48,928 INFO rag_processor.py:embed_document_for_client:508 RAGProcessor 34f726c1-d188-4c49-a7fa-e61fff7436e2 Starte Chunking (client)
{"extra": {"document_id": "c2aba61f-55a6-4ee5-8d47-30795010eddd"}}
2026-03-24 11:07:48,928 INFO rag_processor.py:__init__:124 RAGProcessor 34f726c1-d188-4c49-a7fa-e61fff7436e2 RAGProcessor initialisiert
{"extra": {"embedding_model": "voyage-3-large", "embedding_dimensions": 2048, "chunk_size": 1000, "chunk_overlap": 200}}
2026-03-24 11:07:48,826 DEBUG base_processor.py:init_logger:386 RAGProcessor 34f726c1-d188-4c49-a7fa-e61fff7436e2 Logger initialisiert
2026-03-24 11:07:48,727 INFO rag_processor.py:embed_document_for_client:571 RAGProcessor 590f3dc0-39a5-418e-b2b2-e9246961e6f4 Embedding (client) abgeschlossen
{"extra": {"document_id": "V2Vic2VpdGUvMjAyNi9WaWVsbGVpY2h0IDIwMjYtMDIgQWt0dWVsbGVzIHp1bSBnZWZvcmRlcnRlbiBLbGltYWdlc2V0ei8yMDI2LTAyIEFrdHVlbGxlcyB6dW0gZ2Vmb3JkZXJ0ZW4gS2xpbWFnZXNldHoubWQ=", "chunks": 5, "duration_seconds": 0.4957902431488037}}
2026-03-24 11:07:48,232 INFO rag_processor.py:embed_document_for_client:516 RAGProcessor 590f3dc0-39a5-418e-b2b2-e9246961e6f4 Starte Embedding-Generierung (client)
{"extra": {"chunk_count": 5, "embedding_model": "voyage-3-large", "embedding_dimensions": 2048}}
2026-03-24 11:07:48,232 INFO rag_processor.py:embed_document_for_client:510 RAGProcessor 590f3dc0-39a5-418e-b2b2-e9246961e6f4 Chunking abgeschlossen (client): 5 Chunks erstellt
2026-03-24 11:07:48,232 INFO rag_processor.py:embed_document_for_client:508 RAGProcessor 590f3dc0-39a5-418e-b2b2-e9246961e6f4 Starte Chunking (client)
{"extra": {"document_id": "V2Vic2VpdGUvMjAyNi9WaWVsbGVpY2h0IDIwMjYtMDIgQWt0dWVsbGVzIHp1bSBnZWZvcmRlcnRlbiBLbGltYWdlc2V0ei8yMDI2LTAyIEFrdHVlbGxlcyB6dW0gZ2Vmb3JkZXJ0ZW4gS2xpbWFnZXNldHoubWQ="}}
2026-03-24 11:07:48,231 INFO rag_processor.py:__init__:124 RAGProcessor 590f3dc0-39a5-418e-b2b2-e9246961e6f4 RAGProcessor initialisiert
{"extra": {"embedding_model": "voyage-3-large", "embedding_dimensions": 2048, "chunk_size": 1000, "chunk_overlap": 200}}
2026-03-24 11:07:48,105 DEBUG base_processor.py:init_logger:386 RAGProcessor 590f3dc0-39a5-418e-b2b2-e9246961e6f4 Logger initialisiert
2026-03-24 11:07:46,271 INFO transcription_utils.py:transform_by_template:1461 TransformerProcessor de2bab10-cddb-4264-8768-1e53146ec58c Template-Transformation abgeschlossen
{"duration_ms": 7502.659320831299, "model": "google/gemini-2.5-flash"}
2026-03-24 11:07:46,269 DEBUG transcription_utils.py:create_llm_request:455 TransformerProcessor de2bab10-cddb-4264-8768-1e53146ec58c LLM Interaktion gespeichert
{"file": "cache/transformer/temp/debug/llm/20260324_110746_template_transform.json", "tokens": 3063}
2026-03-24 11:07:46,267 DEBUG transcription_utils.py:create_llm_request:416 TransformerProcessor de2bab10-cddb-4264-8768-1e53146ec58c LLM Request erstellt
{"purpose": "template_transform", "tokens": 3063, "duration": 7502.659320831299, "model": "google/gemini-2.5-flash", "processor": "WhisperTranscriber"}
2026-03-24 11:07:46,267 DEBUG base_processor.py:add_llm_requests:510 TransformerProcessor de2bab10-cddb-4264-8768-1e53146ec58c 1 LLM-Requests hinzugefügt
2026-03-24 11:07:44,978 INFO transcription_utils.py:transform_by_template:1187 TransformerProcessor 5a042acd-f511-481e-b2f7-605f77bf005d Sende Anfrage an LLM Provider
2026-03-24 11:07:44,977 INFO transcription_utils.py:transform_by_template:1171 TransformerProcessor 5a042acd-f511-481e-b2f7-605f77bf005d Token-Prüfung vor Template-Transformation
{"model": "google/gemini-2.5-flash", "model_limit": 8192, "text_length": 96, "estimated_text_tokens": 76, "estimated_system_tokens": 966, "estimated_template_tokens": 1256, "total_estimated_tokens": 2298}
2026-03-24 11:07:44,972 INFO transcription_utils.py:transform_by_template:1461 TransformerProcessor 459d0645-34c4-4120-87ae-9a62ac24b135 Template-Transformation abgeschlossen
{"duration_ms": 8254.221677780151, "model": "google/gemini-2.5-flash"}
2026-03-24 11:07:44,971 DEBUG transcription_utils.py:create_llm_request:455 TransformerProcessor 459d0645-34c4-4120-87ae-9a62ac24b135 LLM Interaktion gespeichert
{"file": "cache/transformer/temp/debug/llm/20260324_110744_template_transform.json", "tokens": 2923}
2026-03-24 11:07:44,969 DEBUG transcription_utils.py:create_llm_request:416 TransformerProcessor 459d0645-34c4-4120-87ae-9a62ac24b135 LLM Request erstellt
{"purpose": "template_transform", "tokens": 2923, "duration": 8254.221677780151, "model": "google/gemini-2.5-flash", "processor": "WhisperTranscriber"}
2026-03-24 11:07:44,969 DEBUG base_processor.py:add_llm_requests:510 TransformerProcessor 459d0645-34c4-4120-87ae-9a62ac24b135 1 LLM-Requests hinzugefügt
2026-03-24 11:07:44,954 INFO transcription_utils.py:_extract_system_prompt:1753 TransformerProcessor 5a042acd-f511-481e-b2f7-605f77bf005d Systemprompt aus Template extrahiert
{"prompt_length": 2504}
2026-03-24 11:07:44,954 INFO transcription_utils.py:transform_by_template:1042 TransformerProcessor 5a042acd-f511-481e-b2f7-605f77bf005d Verwende direkt übergebenes Template-Inhalt (Länge: 5567)
2026-03-24 11:07:44,954 INFO transcription_utils.py:transform_by_template:1027 TransformerProcessor 5a042acd-f511-481e-b2f7-605f77bf005d Starte Template-Transformation:
2026-03-24 11:07:44,954 DEBUG transformer_processor.py:__init__:199 TransformerProcessor 5a042acd-f511-481e-b2f7-605f77bf005d Transformer Init Timings
{"total_init_ms": 211.52, "super_init_ms": 106.0, "config_load_ms": 16.97, "client_init_ms": 0.01, "transcriber_init_ms": 88.12}
2026-03-24 11:07:44,953 DEBUG transformer_processor.py:__init__:179 TransformerProcessor 5a042acd-f511-481e-b2f7-605f77bf005d Transformer Processor initialisiert
{"model": "google/gemini-2.5-flash", "temperature": 0.1, "max_tokens": 4000, "target_format": "text", "max_concurrent_requests": 10, "timeout_seconds": 120, "templates_dir": "resources/templates"}
2026-03-24 11:07:44,743 DEBUG base_processor.py:init_logger:386 TransformerProcessor 5a042acd-f511-481e-b2f7-605f77bf005d Logger initialisiert
2026-03-24 11:07:42,224 INFO transcription_utils.py:transform_by_template:1187 TransformerProcessor 3e4effd4-1349-478b-a061-bb4b662a021b Sende Anfrage an LLM Provider
2026-03-24 11:07:42,224 INFO transcription_utils.py:transform_by_template:1171 TransformerProcessor 3e4effd4-1349-478b-a061-bb4b662a021b Token-Prüfung vor Template-Transformation
{"model": "google/gemini-2.5-flash", "model_limit": 8192, "text_length": 1090, "estimated_text_tokens": 530, "estimated_system_tokens": 966, "estimated_template_tokens": 1278, "total_estimated_tokens": 2774}
2026-03-24 11:07:42,198 INFO transcription_utils.py:_extract_system_prompt:1753 TransformerProcessor 3e4effd4-1349-478b-a061-bb4b662a021b Systemprompt aus Template extrahiert
{"prompt_length": 2504}
2026-03-24 11:07:42,198 INFO transcription_utils.py:transform_by_template:1042 TransformerProcessor 3e4effd4-1349-478b-a061-bb4b662a021b Verwende direkt übergebenes Template-Inhalt (Länge: 5567)
2026-03-24 11:07:42,197 INFO transcription_utils.py:transform_by_template:1027 TransformerProcessor 3e4effd4-1349-478b-a061-bb4b662a021b Starte Template-Transformation:
2026-03-24 11:07:42,197 DEBUG transformer_processor.py:__init__:199 TransformerProcessor 3e4effd4-1349-478b-a061-bb4b662a021b Transformer Init Timings
{"total_init_ms": 278.03, "super_init_ms": 148.16, "config_load_ms": 17.7, "client_init_ms": 0.01, "transcriber_init_ms": 111.74}
2026-03-24 11:07:42,197 DEBUG transformer_processor.py:__init__:179 TransformerProcessor 3e4effd4-1349-478b-a061-bb4b662a021b Transformer Processor initialisiert
{"model": "google/gemini-2.5-flash", "temperature": 0.1, "max_tokens": 4000, "target_format": "text", "max_concurrent_requests": 10, "timeout_seconds": 120, "templates_dir": "resources/templates"}
2026-03-24 11:07:41,920 DEBUG base_processor.py:init_logger:386 TransformerProcessor 3e4effd4-1349-478b-a061-bb4b662a021b Logger initialisiert
2026-03-24 11:07:40,201 INFO transcription_utils.py:transform_by_template:1187 TransformerProcessor 83299804-6e1c-48a9-95fe-b12a539832b9 Sende Anfrage an LLM Provider
2026-03-24 11:07:40,201 INFO transcription_utils.py:transform_by_template:1171 TransformerProcessor 83299804-6e1c-48a9-95fe-b12a539832b9 Token-Prüfung vor Template-Transformation
{"model": "google/gemini-2.5-flash", "model_limit": 8192, "text_length": 190, "estimated_text_tokens": 152, "estimated_system_tokens": 966, "estimated_template_tokens": 1254, "total_estimated_tokens": 2372}
2026-03-24 11:07:40,176 INFO transcription_utils.py:_extract_system_prompt:1753 TransformerProcessor 83299804-6e1c-48a9-95fe-b12a539832b9 Systemprompt aus Template extrahiert
{"prompt_length": 2504}
2026-03-24 11:07:40,176 INFO transcription_utils.py:transform_by_template:1042 TransformerProcessor 83299804-6e1c-48a9-95fe-b12a539832b9 Verwende direkt übergebenes Template-Inhalt (Länge: 5567)
2026-03-24 11:07:40,176 INFO transcription_utils.py:transform_by_template:1027 TransformerProcessor 83299804-6e1c-48a9-95fe-b12a539832b9 Starte Template-Transformation:
2026-03-24 11:07:40,176 DEBUG transformer_processor.py:__init__:199 TransformerProcessor 83299804-6e1c-48a9-95fe-b12a539832b9 Transformer Init Timings
{"total_init_ms": 213.8, "super_init_ms": 106.83, "config_load_ms": 20.89, "client_init_ms": 0.02, "transcriber_init_ms": 85.67}
2026-03-24 11:07:40,175 DEBUG transformer_processor.py:__init__:179 TransformerProcessor 83299804-6e1c-48a9-95fe-b12a539832b9 Transformer Processor initialisiert
{"model": "google/gemini-2.5-flash", "temperature": 0.1, "max_tokens": 4000, "target_format": "text", "max_concurrent_requests": 10, "timeout_seconds": 120, "templates_dir": "resources/templates"}
2026-03-24 11:07:39,962 DEBUG base_processor.py:init_logger:386 TransformerProcessor 83299804-6e1c-48a9-95fe-b12a539832b9 Logger initialisiert
2026-03-24 11:07:38,764 INFO transcription_utils.py:transform_by_template:1187 TransformerProcessor de2bab10-cddb-4264-8768-1e53146ec58c Sende Anfrage an LLM Provider
2026-03-24 11:07:38,764 INFO transcription_utils.py:transform_by_template:1171 TransformerProcessor de2bab10-cddb-4264-8768-1e53146ec58c Token-Prüfung vor Template-Transformation
{"model": "google/gemini-2.5-flash", "model_limit": 8192, "text_length": 1656, "estimated_text_tokens": 626, "estimated_system_tokens": 966, "estimated_template_tokens": 1278, "total_estimated_tokens": 2870}
2026-03-24 11:07:38,739 INFO transcription_utils.py:_extract_system_prompt:1753 TransformerProcessor de2bab10-cddb-4264-8768-1e53146ec58c Systemprompt aus Template extrahiert
{"prompt_length": 2504}
2026-03-24 11:07:38,739 INFO transcription_utils.py:transform_by_template:1042 TransformerProcessor de2bab10-cddb-4264-8768-1e53146ec58c Verwende direkt übergebenes Template-Inhalt (Länge: 5567)
2026-03-24 11:07:38,739 INFO transcription_utils.py:transform_by_template:1027 TransformerProcessor de2bab10-cddb-4264-8768-1e53146ec58c Starte Template-Transformation:
2026-03-24 11:07:38,739 DEBUG transformer_processor.py:__init__:199 TransformerProcessor de2bab10-cddb-4264-8768-1e53146ec58c Transformer Init Timings
{"total_init_ms": 204.89, "super_init_ms": 99.51, "config_load_ms": 17.29, "client_init_ms": 0.01, "transcriber_init_ms": 87.63}
2026-03-24 11:07:38,739 DEBUG transformer_processor.py:__init__:179 TransformerProcessor de2bab10-cddb-4264-8768-1e53146ec58c Transformer Processor initialisiert
{"model": "google/gemini-2.5-flash", "temperature": 0.1, "max_tokens": 4000, "target_format": "text", "max_concurrent_requests": 10, "timeout_seconds": 120, "templates_dir": "resources/templates"}
2026-03-24 11:07:38,534 DEBUG base_processor.py:init_logger:386 TransformerProcessor de2bab10-cddb-4264-8768-1e53146ec58c Logger initialisiert
2026-03-24 11:07:36,714 INFO transcription_utils.py:transform_by_template:1187 TransformerProcessor 459d0645-34c4-4120-87ae-9a62ac24b135 Sende Anfrage an LLM Provider
2026-03-24 11:07:36,714 INFO transcription_utils.py:transform_by_template:1171 TransformerProcessor 459d0645-34c4-4120-87ae-9a62ac24b135 Token-Prüfung vor Template-Transformation
{"model": "google/gemini-2.5-flash", "model_limit": 8192, "text_length": 775, "estimated_text_tokens": 620, "estimated_system_tokens": 966, "estimated_template_tokens": 1274, "total_estimated_tokens": 2860}
2026-03-24 11:07:36,685 INFO transcription_utils.py:_extract_system_prompt:1753 TransformerProcessor 459d0645-34c4-4120-87ae-9a62ac24b135 Systemprompt aus Template extrahiert
{"prompt_length": 2504}
2026-03-24 11:07:36,685 INFO transcription_utils.py:transform_by_template:1042 TransformerProcessor 459d0645-34c4-4120-87ae-9a62ac24b135 Verwende direkt übergebenes Template-Inhalt (Länge: 5567)
2026-03-24 11:07:36,685 INFO transcription_utils.py:transform_by_template:1027 TransformerProcessor 459d0645-34c4-4120-87ae-9a62ac24b135 Starte Template-Transformation:
2026-03-24 11:07:36,684 DEBUG transformer_processor.py:__init__:199 TransformerProcessor 459d0645-34c4-4120-87ae-9a62ac24b135 Transformer Init Timings
{"total_init_ms": 230.89, "super_init_ms": 116.75, "config_load_ms": 17.42, "client_init_ms": 0.01, "transcriber_init_ms": 96.3}
2026-03-24 11:07:36,684 DEBUG transformer_processor.py:__init__:179 TransformerProcessor 459d0645-34c4-4120-87ae-9a62ac24b135 Transformer Processor initialisiert
{"model": "google/gemini-2.5-flash", "temperature": 0.1, "max_tokens": 4000, "target_format": "text", "max_concurrent_requests": 10, "timeout_seconds": 120, "templates_dir": "resources/templates"}
2026-03-24 11:07:36,454 DEBUG base_processor.py:init_logger:386 TransformerProcessor 459d0645-34c4-4120-87ae-9a62ac24b135 Logger initialisiert
2026-03-24 11:05:34,192 INFO rag_processor.py:embed_document_for_client:571 RAGProcessor 06f8ecce-aeba-4b95-ad86-adf0643c3f30 Embedding (client) abgeschlossen
{"extra": {"document_id": "98f57032-ceb4-44fd-9c94-a9d8122f7bc9", "chunks": 1, "duration_seconds": 0.26165771484375}}
2026-03-24 11:05:33,930 INFO rag_processor.py:embed_document_for_client:516 RAGProcessor 06f8ecce-aeba-4b95-ad86-adf0643c3f30 Starte Embedding-Generierung (client)
{"extra": {"chunk_count": 1, "embedding_model": "voyage-3-large", "embedding_dimensions": 2048}}
2026-03-24 11:05:33,930 INFO rag_processor.py:embed_document_for_client:510 RAGProcessor 06f8ecce-aeba-4b95-ad86-adf0643c3f30 Chunking abgeschlossen (client): 1 Chunks erstellt
2026-03-24 11:05:33,930 INFO rag_processor.py:embed_document_for_client:508 RAGProcessor 06f8ecce-aeba-4b95-ad86-adf0643c3f30 Starte Chunking (client)
{"extra": {"document_id": "98f57032-ceb4-44fd-9c94-a9d8122f7bc9"}}
2026-03-24 11:05:33,930 INFO rag_processor.py:__init__:124 RAGProcessor 06f8ecce-aeba-4b95-ad86-adf0643c3f30 RAGProcessor initialisiert
{"extra": {"embedding_model": "voyage-3-large", "embedding_dimensions": 2048, "chunk_size": 1000, "chunk_overlap": 200}}
2026-03-24 11:05:33,821 DEBUG base_processor.py:init_logger:386 RAGProcessor 06f8ecce-aeba-4b95-ad86-adf0643c3f30 Logger initialisiert
2026-03-24 11:05:33,591 INFO rag_processor.py:embed_document_for_client:571 RAGProcessor 7650a311-66c5-4023-a219-d9b26da0a479 Embedding (client) abgeschlossen
{"extra": {"document_id": "01XERETUPQJITDPOX3INCKF2XPOYDDOTFB", "chunks": 16, "duration_seconds": 1.048949956893921}}
2026-03-24 11:05:32,542 INFO rag_processor.py:embed_document_for_client:516 RAGProcessor 7650a311-66c5-4023-a219-d9b26da0a479 Starte Embedding-Generierung (client)
{"extra": {"chunk_count": 16, "embedding_model": "voyage-3-large", "embedding_dimensions": 2048}}
2026-03-24 11:05:32,542 INFO rag_processor.py:embed_document_for_client:510 RAGProcessor 7650a311-66c5-4023-a219-d9b26da0a479 Chunking abgeschlossen (client): 16 Chunks erstellt
2026-03-24 11:05:32,542 INFO rag_processor.py:embed_document_for_client:508 RAGProcessor 7650a311-66c5-4023-a219-d9b26da0a479 Starte Chunking (client)
{"extra": {"document_id": "01XERETUPQJITDPOX3INCKF2XPOYDDOTFB"}}
2026-03-24 11:05:32,541 INFO rag_processor.py:__init__:124 RAGProcessor 7650a311-66c5-4023-a219-d9b26da0a479 RAGProcessor initialisiert
{"extra": {"embedding_model": "voyage-3-large", "embedding_dimensions": 2048, "chunk_size": 1000, "chunk_overlap": 200}}
2026-03-24 11:05:32,412 DEBUG base_processor.py:init_logger:386 RAGProcessor 7650a311-66c5-4023-a219-d9b26da0a479 Logger initialisiert
2026-03-24 11:04:03,977 INFO transcription_utils.py:transform_by_template:1461 TransformerProcessor bf8593ef-4907-4f2e-bb9e-031bba1a7aef Template-Transformation abgeschlossen
{"duration_ms": 16262.48836517334, "model": "google/gemini-2.5-flash"}
2026-03-24 11:04:03,975 DEBUG transcription_utils.py:create_llm_request:455 TransformerProcessor bf8593ef-4907-4f2e-bb9e-031bba1a7aef LLM Interaktion gespeichert
{"file": "cache/transformer/temp/debug/llm/20260324_110403_template_transform.json", "tokens": 8976}
2026-03-24 11:04:03,975 DEBUG transcription_utils.py:create_llm_request:416 TransformerProcessor bf8593ef-4907-4f2e-bb9e-031bba1a7aef LLM Request erstellt
{"purpose": "template_transform", "tokens": 8976, "duration": 16262.48836517334, "model": "google/gemini-2.5-flash", "processor": "WhisperTranscriber"}
2026-03-24 11:04:03,974 DEBUG base_processor.py:add_llm_requests:510 TransformerProcessor bf8593ef-4907-4f2e-bb9e-031bba1a7aef 1 LLM-Requests hinzugefügt
2026-03-24 11:03:47,711 INFO transcription_utils.py:transform_by_template:1187 TransformerProcessor bf8593ef-4907-4f2e-bb9e-031bba1a7aef Sende Anfrage an LLM Provider
2026-03-24 11:03:47,711 INFO transcription_utils.py:transform_by_template:1171 TransformerProcessor bf8593ef-4907-4f2e-bb9e-031bba1a7aef Token-Prüfung vor Template-Transformation
{"model": "google/gemini-2.5-flash", "model_limit": 8192, "text_length": 13134, "estimated_text_tokens": 5166, "estimated_system_tokens": 1634, "estimated_template_tokens": 1506, "total_estimated_tokens": 8306}
2026-03-24 11:03:47,688 INFO transcription_utils.py:_extract_system_prompt:1753 TransformerProcessor bf8593ef-4907-4f2e-bb9e-031bba1a7aef Systemprompt aus Template extrahiert
{"prompt_length": 4362}
2026-03-24 11:03:47,688 INFO transcription_utils.py:transform_by_template:1042 TransformerProcessor bf8593ef-4907-4f2e-bb9e-031bba1a7aef Verwende direkt übergebenes Template-Inhalt (Länge: 8412)
2026-03-24 11:03:47,688 INFO transcription_utils.py:transform_by_template:1027 TransformerProcessor bf8593ef-4907-4f2e-bb9e-031bba1a7aef Starte Template-Transformation:
2026-03-24 11:03:47,687 DEBUG transformer_processor.py:__init__:199 TransformerProcessor bf8593ef-4907-4f2e-bb9e-031bba1a7aef Transformer Init Timings
{"total_init_ms": 231.54, "super_init_ms": 129.5, "config_load_ms": 18.58, "client_init_ms": 0.01, "transcriber_init_ms": 83.12}
2026-03-24 11:03:47,687 DEBUG transformer_processor.py:__init__:179 TransformerProcessor bf8593ef-4907-4f2e-bb9e-031bba1a7aef Transformer Processor initialisiert
{"model": "google/gemini-2.5-flash", "temperature": 0.1, "max_tokens": 4000, "target_format": "text", "max_concurrent_requests": 10, "timeout_seconds": 120, "templates_dir": "resources/templates"}
2026-03-24 11:03:47,456 DEBUG base_processor.py:init_logger:386 TransformerProcessor bf8593ef-4907-4f2e-bb9e-031bba1a7aef Logger initialisiert
2026-03-24 11:00:08,529 DEBUG transformer_processor.py:__init__:199 TransformerProcessor b9f9842d-1d57-49da-a71a-9aeb1fc25364 Transformer Init Timings
{"total_init_ms": 244.16, "super_init_ms": 132.43, "config_load_ms": 23.39, "client_init_ms": 0.01, "transcriber_init_ms": 88.0}
2026-03-24 11:00:08,529 DEBUG transformer_processor.py:__init__:179 TransformerProcessor b9f9842d-1d57-49da-a71a-9aeb1fc25364 Transformer Processor initialisiert
{"model": "google/gemini-2.5-flash", "temperature": 0.1, "max_tokens": 4000, "target_format": "text", "max_concurrent_requests": 10, "timeout_seconds": 120, "templates_dir": "resources/templates"}
2026-03-24 11:00:08,285 DEBUG base_processor.py:init_logger:386 TransformerProcessor b9f9842d-1d57-49da-a71a-9aeb1fc25364 Logger initialisiert
2026-03-24 10:46:38,519 INFO rag_processor.py:embed_document_for_client:571 RAGProcessor 80faa4a8-0dcf-4212-9514-7c00c1643064 Embedding (client) abgeschlossen
{"extra": {"document_id": "ea0b0c53-5e3b-4e26-9e4c-611a532637f4", "chunks": 1, "duration_seconds": 0.26511693000793457}}
2026-03-24 10:46:38,254 INFO rag_processor.py:embed_document_for_client:516 RAGProcessor 80faa4a8-0dcf-4212-9514-7c00c1643064 Starte Embedding-Generierung (client)
{"extra": {"chunk_count": 1, "embedding_model": "voyage-3-large", "embedding_dimensions": 2048}}
2026-03-24 10:46:38,254 INFO rag_processor.py:embed_document_for_client:510 RAGProcessor 80faa4a8-0dcf-4212-9514-7c00c1643064 Chunking abgeschlossen (client): 1 Chunks erstellt
2026-03-24 10:46:38,254 INFO rag_processor.py:embed_document_for_client:508 RAGProcessor 80faa4a8-0dcf-4212-9514-7c00c1643064 Starte Chunking (client)
{"extra": {"document_id": "ea0b0c53-5e3b-4e26-9e4c-611a532637f4"}}
2026-03-24 10:46:38,254 INFO rag_processor.py:__init__:124 RAGProcessor 80faa4a8-0dcf-4212-9514-7c00c1643064 RAGProcessor initialisiert
{"extra": {"embedding_model": "voyage-3-large", "embedding_dimensions": 2048, "chunk_size": 1000, "chunk_overlap": 200}}
2026-03-24 10:46:38,144 DEBUG base_processor.py:init_logger:386 RAGProcessor 80faa4a8-0dcf-4212-9514-7c00c1643064 Logger initialisiert
2026-03-24 10:46:37,977 INFO rag_processor.py:embed_document_for_client:571 RAGProcessor f502141b-912c-4ef4-abe0-46415ae6a98b Embedding (client) abgeschlossen
{"extra": {"document_id": "V2Vic2VpdGUvMjAyMy8yMDIzLTA1IFNUT1AgUGVzdGl6aWRlL0RFVV9JbmZvbWF0ZXJpYWxXYW5kZXJ1bmdTdG9wUGVzdGl6aWRlLnBkZg==", "chunks": 9, "duration_seconds": 0.5901477336883545}}
2026-03-24 10:46:37,387 INFO rag_processor.py:embed_document_for_client:516 RAGProcessor f502141b-912c-4ef4-abe0-46415ae6a98b Starte Embedding-Generierung (client)
{"extra": {"chunk_count": 9, "embedding_model": "voyage-3-large", "embedding_dimensions": 2048}}
2026-03-24 10:46:37,387 INFO rag_processor.py:embed_document_for_client:510 RAGProcessor f502141b-912c-4ef4-abe0-46415ae6a98b Chunking abgeschlossen (client): 9 Chunks erstellt
2026-03-24 10:46:37,386 INFO rag_processor.py:embed_document_for_client:508 RAGProcessor f502141b-912c-4ef4-abe0-46415ae6a98b Starte Chunking (client)
{"extra": {"document_id": "V2Vic2VpdGUvMjAyMy8yMDIzLTA1IFNUT1AgUGVzdGl6aWRlL0RFVV9JbmZvbWF0ZXJpYWxXYW5kZXJ1bmdTdG9wUGVzdGl6aWRlLnBkZg=="}}
2026-03-24 10:46:37,386 INFO rag_processor.py:__init__:124 RAGProcessor f502141b-912c-4ef4-abe0-46415ae6a98b RAGProcessor initialisiert
{"extra": {"embedding_model": "voyage-3-large", "embedding_dimensions": 2048, "chunk_size": 1000, "chunk_overlap": 200}}
2026-03-24 10:46:37,278 DEBUG base_processor.py:init_logger:386 RAGProcessor f502141b-912c-4ef4-abe0-46415ae6a98b Logger initialisiert
2026-03-24 10:46:33,000 INFO transcription_utils.py:transform_by_template:1461 TransformerProcessor 6e65fb99-689f-4bb6-8e49-7b7df4d059eb Template-Transformation abgeschlossen
{"duration_ms": 9171.80848121643, "model": "google/gemini-2.5-flash"}
2026-03-24 10:46:32,999 DEBUG transcription_utils.py:create_llm_request:455 TransformerProcessor 6e65fb99-689f-4bb6-8e49-7b7df4d059eb LLM Interaktion gespeichert
{"file": "cache/transformer/temp/debug/llm/20260324_104632_template_transform.json", "tokens": 4853}
2026-03-24 10:46:32,998 DEBUG transcription_utils.py:create_llm_request:416 TransformerProcessor 6e65fb99-689f-4bb6-8e49-7b7df4d059eb LLM Request erstellt
{"purpose": "template_transform", "tokens": 4853, "duration": 9171.80848121643, "model": "google/gemini-2.5-flash", "processor": "WhisperTranscriber"}
2026-03-24 10:46:32,998 DEBUG base_processor.py:add_llm_requests:510 TransformerProcessor 6e65fb99-689f-4bb6-8e49-7b7df4d059eb 1 LLM-Requests hinzugefügt
2026-03-24 10:46:25,969 INFO rag_processor.py:embed_document_for_client:571 RAGProcessor 429c4021-14ac-404d-93b3-47d39c67a319 Embedding (client) abgeschlossen
{"extra": {"document_id": "4fe1de2a-acaf-412a-8527-0f9662a68ea6", "chunks": 1, "duration_seconds": 0.26428675651550293}}
2026-03-24 10:46:25,705 INFO rag_processor.py:embed_document_for_client:516 RAGProcessor 429c4021-14ac-404d-93b3-47d39c67a319 Starte Embedding-Generierung (client)
{"extra": {"chunk_count": 1, "embedding_model": "voyage-3-large", "embedding_dimensions": 2048}}
2026-03-24 10:46:25,705 INFO rag_processor.py:embed_document_for_client:510 RAGProcessor 429c4021-14ac-404d-93b3-47d39c67a319 Chunking abgeschlossen (client): 1 Chunks erstellt
2026-03-24 10:46:25,704 INFO rag_processor.py:embed_document_for_client:508 RAGProcessor 429c4021-14ac-404d-93b3-47d39c67a319 Starte Chunking (client)
{"extra": {"document_id": "4fe1de2a-acaf-412a-8527-0f9662a68ea6"}}
2026-03-24 10:46:25,704 INFO rag_processor.py:__init__:124 RAGProcessor 429c4021-14ac-404d-93b3-47d39c67a319 RAGProcessor initialisiert
{"extra": {"embedding_model": "voyage-3-large", "embedding_dimensions": 2048, "chunk_size": 1000, "chunk_overlap": 200}}
2026-03-24 10:46:25,567 DEBUG base_processor.py:init_logger:386 RAGProcessor 429c4021-14ac-404d-93b3-47d39c67a319 Logger initialisiert
2026-03-24 10:46:25,453 INFO rag_processor.py:embed_document_for_client:571 RAGProcessor d6f6fb0c-b66c-4c6f-aad2-16a3f0f5fd8c Embedding (client) abgeschlossen
{"extra": {"document_id": "V2Vic2VpdGUvMjAyMy8yMDIzLTA1IFNUT1AgUGVzdGl6aWRlL0RFVV9GbHllcldhbmRlcnVuZ1N0b3BQZXN0aXppZGUucGRm", "chunks": 5, "duration_seconds": 0.3109931945800781}}
2026-03-24 10:46:25,142 INFO rag_processor.py:embed_document_for_client:516 RAGProcessor d6f6fb0c-b66c-4c6f-aad2-16a3f0f5fd8c Starte Embedding-Generierung (client)
{"extra": {"chunk_count": 5, "embedding_model": "voyage-3-large", "embedding_dimensions": 2048}}
2026-03-24 10:46:25,142 INFO rag_processor.py:embed_document_for_client:510 RAGProcessor d6f6fb0c-b66c-4c6f-aad2-16a3f0f5fd8c Chunking abgeschlossen (client): 5 Chunks erstellt
2026-03-24 10:46:25,142 INFO rag_processor.py:embed_document_for_client:508 RAGProcessor d6f6fb0c-b66c-4c6f-aad2-16a3f0f5fd8c Starte Chunking (client)
{"extra": {"document_id": "V2Vic2VpdGUvMjAyMy8yMDIzLTA1IFNUT1AgUGVzdGl6aWRlL0RFVV9GbHllcldhbmRlcnVuZ1N0b3BQZXN0aXppZGUucGRm"}}
2026-03-24 10:46:25,141 INFO rag_processor.py:__init__:124 RAGProcessor d6f6fb0c-b66c-4c6f-aad2-16a3f0f5fd8c RAGProcessor initialisiert
{"extra": {"embedding_model": "voyage-3-large", "embedding_dimensions": 2048, "chunk_size": 1000, "chunk_overlap": 200}}
2026-03-24 10:46:24,990 DEBUG base_processor.py:init_logger:386 RAGProcessor d6f6fb0c-b66c-4c6f-aad2-16a3f0f5fd8c Logger initialisiert
2026-03-24 10:46:23,826 INFO transcription_utils.py:transform_by_template:1187 TransformerProcessor 6e65fb99-689f-4bb6-8e49-7b7df4d059eb Sende Anfrage an LLM Provider
2026-03-24 10:46:23,825 INFO transcription_utils.py:transform_by_template:1171 TransformerProcessor 6e65fb99-689f-4bb6-8e49-7b7df4d059eb Token-Prüfung vor Template-Transformation
{"model": "google/gemini-2.5-flash", "model_limit": 8192, "text_length": 5147, "estimated_text_tokens": 2006, "estimated_system_tokens": 966, "estimated_template_tokens": 1242, "total_estimated_tokens": 4214}
2026-03-24 10:46:23,804 INFO transcription_utils.py:_extract_system_prompt:1753 TransformerProcessor 6e65fb99-689f-4bb6-8e49-7b7df4d059eb Systemprompt aus Template extrahiert
{"prompt_length": 2504}
2026-03-24 10:46:23,803 INFO transcription_utils.py:transform_by_template:1042 TransformerProcessor 6e65fb99-689f-4bb6-8e49-7b7df4d059eb Verwende direkt übergebenes Template-Inhalt (Länge: 5567)
2026-03-24 10:46:23,803 INFO transcription_utils.py:transform_by_template:1027 TransformerProcessor 6e65fb99-689f-4bb6-8e49-7b7df4d059eb Starte Template-Transformation:
2026-03-24 10:46:23,803 DEBUG transformer_processor.py:__init__:199 TransformerProcessor 6e65fb99-689f-4bb6-8e49-7b7df4d059eb Transformer Init Timings
{"total_init_ms": 201.64, "super_init_ms": 101.22, "config_load_ms": 18.9, "client_init_ms": 0.01, "transcriber_init_ms": 81.19}
2026-03-24 10:46:23,803 DEBUG transformer_processor.py:__init__:179 TransformerProcessor 6e65fb99-689f-4bb6-8e49-7b7df4d059eb Transformer Processor initialisiert
{"model": "google/gemini-2.5-flash", "temperature": 0.1, "max_tokens": 4000, "target_format": "text", "max_concurrent_requests": 10, "timeout_seconds": 120, "templates_dir": "resources/templates"}
2026-03-24 10:46:23,602 DEBUG base_processor.py:init_logger:386 TransformerProcessor 6e65fb99-689f-4bb6-8e49-7b7df4d059eb Logger initialisiert
2026-03-24 10:46:20,628 INFO transcription_utils.py:transform_by_template:1461 TransformerProcessor a479c418-4b95-495a-be82-6a53e71fc1ae Template-Transformation abgeschlossen
{"duration_ms": 4819.0295696258545, "model": "google/gemini-2.5-flash"}
2026-03-24 10:46:20,627 DEBUG transcription_utils.py:create_llm_request:455 TransformerProcessor a479c418-4b95-495a-be82-6a53e71fc1ae LLM Interaktion gespeichert
{"file": "cache/transformer/temp/debug/llm/20260324_104620_template_transform.json", "tokens": 3205}
2026-03-24 10:46:20,626 DEBUG transcription_utils.py:create_llm_request:416 TransformerProcessor a479c418-4b95-495a-be82-6a53e71fc1ae LLM Request erstellt
{"purpose": "template_transform", "tokens": 3205, "duration": 4819.0295696258545, "model": "google/gemini-2.5-flash", "processor": "WhisperTranscriber"}
2026-03-24 10:46:20,625 DEBUG base_processor.py:add_llm_requests:510 TransformerProcessor a479c418-4b95-495a-be82-6a53e71fc1ae 1 LLM-Requests hinzugefügt
2026-03-24 10:46:18,527 INFO rag_processor.py:embed_document_for_client:571 RAGProcessor 781e0430-cf7d-45f8-940e-857fc96d1ec3 Embedding (client) abgeschlossen
{"extra": {"document_id": "f7c319a4-2f45-48b0-9258-c8cbb56927bb", "chunks": 1, "duration_seconds": 0.3313724994659424}}
2026-03-24 10:46:18,196 INFO rag_processor.py:embed_document_for_client:516 RAGProcessor 781e0430-cf7d-45f8-940e-857fc96d1ec3 Starte Embedding-Generierung (client)
{"extra": {"chunk_count": 1, "embedding_model": "voyage-3-large", "embedding_dimensions": 2048}}
2026-03-24 10:46:18,196 INFO rag_processor.py:embed_document_for_client:510 RAGProcessor 781e0430-cf7d-45f8-940e-857fc96d1ec3 Chunking abgeschlossen (client): 1 Chunks erstellt
2026-03-24 10:46:18,195 INFO rag_processor.py:embed_document_for_client:508 RAGProcessor 781e0430-cf7d-45f8-940e-857fc96d1ec3 Starte Chunking (client)
{"extra": {"document_id": "f7c319a4-2f45-48b0-9258-c8cbb56927bb"}}
2026-03-24 10:46:18,195 INFO rag_processor.py:__init__:124 RAGProcessor 781e0430-cf7d-45f8-940e-857fc96d1ec3 RAGProcessor initialisiert
{"extra": {"embedding_model": "voyage-3-large", "embedding_dimensions": 2048, "chunk_size": 1000, "chunk_overlap": 200}}
2026-03-24 10:46:18,057 DEBUG base_processor.py:init_logger:386 RAGProcessor 781e0430-cf7d-45f8-940e-857fc96d1ec3 Logger initialisiert
2026-03-24 10:46:17,979 INFO rag_processor.py:embed_document_for_client:571 RAGProcessor bb734599-5b52-42b1-9e89-3bbd65dbdba6 Embedding (client) abgeschlossen
{"extra": {"document_id": "V2Vic2VpdGUvMjAyMy8yMDIzLTA0IERpc3NpbmdlciBnZWdlbiBNZXNzbmVyIEF1c3NhZ2VuIHNpbmQga29udHJhcHJvZHVrL0Rpc3NpbmdlciBnZWdlbiBNZXNzbmVyIEF1c3NhZ2VuIHNpbmQga29udHJhcHJvZHVrIDIzN2ZmOTJhOGM0MDRlYjQ5YWFiNThiMjQyZjI1ZTBiLm1k", "chunks": 4, "duration_seconds": 0.3027069568634033}}
2026-03-24 10:46:17,677 INFO rag_processor.py:embed_document_for_client:516 RAGProcessor bb734599-5b52-42b1-9e89-3bbd65dbdba6 Starte Embedding-Generierung (client)
{"extra": {"chunk_count": 4, "embedding_model": "voyage-3-large", "embedding_dimensions": 2048}}
2026-03-24 10:46:17,677 INFO rag_processor.py:embed_document_for_client:510 RAGProcessor bb734599-5b52-42b1-9e89-3bbd65dbdba6 Chunking abgeschlossen (client): 4 Chunks erstellt
2026-03-24 10:46:17,676 INFO rag_processor.py:embed_document_for_client:508 RAGProcessor bb734599-5b52-42b1-9e89-3bbd65dbdba6 Starte Chunking (client)
{"extra": {"document_id": "V2Vic2VpdGUvMjAyMy8yMDIzLTA0IERpc3NpbmdlciBnZWdlbiBNZXNzbmVyIEF1c3NhZ2VuIHNpbmQga29udHJhcHJvZHVrL0Rpc3NpbmdlciBnZWdlbiBNZXNzbmVyIEF1c3NhZ2VuIHNpbmQga29udHJhcHJvZHVrIDIzN2ZmOTJhOGM0MDRlYjQ5YWFiNThiMjQyZjI1ZTBiLm1k"}}
2026-03-24 10:46:17,676 INFO rag_processor.py:__init__:124 RAGProcessor bb734599-5b52-42b1-9e89-3bbd65dbdba6 RAGProcessor initialisiert
{"extra": {"embedding_model": "voyage-3-large", "embedding_dimensions": 2048, "chunk_size": 1000, "chunk_overlap": 200}}
2026-03-24 10:46:17,576 DEBUG base_processor.py:init_logger:386 RAGProcessor bb734599-5b52-42b1-9e89-3bbd65dbdba6 Logger initialisiert
2026-03-24 10:46:16,762 INFO rag_processor.py:embed_document_for_client:571 RAGProcessor a902296f-453d-433b-a251-2f931a408187 Embedding (client) abgeschlossen
{"extra": {"document_id": "1f2e12bb-0dcc-4cb2-aea4-00380d9cc540", "chunks": 1, "duration_seconds": 0.2981421947479248}}
2026-03-24 10:46:16,464 INFO rag_processor.py:embed_document_for_client:516 RAGProcessor a902296f-453d-433b-a251-2f931a408187 Starte Embedding-Generierung (client)
{"extra": {"chunk_count": 1, "embedding_model": "voyage-3-large", "embedding_dimensions": 2048}}
2026-03-24 10:46:16,464 INFO rag_processor.py:embed_document_for_client:510 RAGProcessor a902296f-453d-433b-a251-2f931a408187 Chunking abgeschlossen (client): 1 Chunks erstellt
2026-03-24 10:46:16,464 INFO rag_processor.py:embed_document_for_client:508 RAGProcessor a902296f-453d-433b-a251-2f931a408187 Starte Chunking (client)
{"extra": {"document_id": "1f2e12bb-0dcc-4cb2-aea4-00380d9cc540"}}
2026-03-24 10:46:16,463 INFO rag_processor.py:__init__:124 RAGProcessor a902296f-453d-433b-a251-2f931a408187 RAGProcessor initialisiert
{"extra": {"embedding_model": "voyage-3-large", "embedding_dimensions": 2048, "chunk_size": 1000, "chunk_overlap": 200}}
2026-03-24 10:46:16,364 DEBUG base_processor.py:init_logger:386 RAGProcessor a902296f-453d-433b-a251-2f931a408187 Logger initialisiert
2026-03-24 10:46:16,258 INFO rag_processor.py:embed_document_for_client:571 RAGProcessor b829bb08-15b9-40dd-a1ad-86899942faac Embedding (client) abgeschlossen
{"extra": {"document_id": "V2Vic2VpdGUvMjAyMy8yMDIzLTA0IEdyw7xuZHVuZyAtIFByZXNzZWtvbmZlcmVuei9HcsO8bmR1bmcgLSBQcmVzc2Vrb25mZXJlbnogNGZjZjg4N2NiY2Y5NDA2Mjg1YmQ0NmMwZGVmNmRlZmUubWQ=", "chunks": 6, "duration_seconds": 0.3346409797668457}}
2026-03-24 10:46:15,924 INFO rag_processor.py:embed_document_for_client:516 RAGProcessor b829bb08-15b9-40dd-a1ad-86899942faac Starte Embedding-Generierung (client)
{"extra": {"chunk_count": 6, "embedding_model": "voyage-3-large", "embedding_dimensions": 2048}}
2026-03-24 10:46:15,924 INFO rag_processor.py:embed_document_for_client:510 RAGProcessor b829bb08-15b9-40dd-a1ad-86899942faac Chunking abgeschlossen (client): 6 Chunks erstellt
2026-03-24 10:46:15,923 INFO rag_processor.py:embed_document_for_client:508 RAGProcessor b829bb08-15b9-40dd-a1ad-86899942faac Starte Chunking (client)
{"extra": {"document_id": "V2Vic2VpdGUvMjAyMy8yMDIzLTA0IEdyw7xuZHVuZyAtIFByZXNzZWtvbmZlcmVuei9HcsO8bmR1bmcgLSBQcmVzc2Vrb25mZXJlbnogNGZjZjg4N2NiY2Y5NDA2Mjg1YmQ0NmMwZGVmNmRlZmUubWQ="}}
2026-03-24 10:46:15,923 INFO rag_processor.py:__init__:124 RAGProcessor b829bb08-15b9-40dd-a1ad-86899942faac RAGProcessor initialisiert
{"extra": {"embedding_model": "voyage-3-large", "embedding_dimensions": 2048, "chunk_size": 1000, "chunk_overlap": 200}}
2026-03-24 10:46:15,837 DEBUG base_processor.py:init_logger:386 RAGProcessor b829bb08-15b9-40dd-a1ad-86899942faac Logger initialisiert
2026-03-24 10:46:15,806 INFO transcription_utils.py:transform_by_template:1187 TransformerProcessor a479c418-4b95-495a-be82-6a53e71fc1ae Sende Anfrage an LLM Provider
2026-03-24 10:46:15,805 INFO transcription_utils.py:transform_by_template:1171 TransformerProcessor a479c418-4b95-495a-be82-6a53e71fc1ae Token-Prüfung vor Template-Transformation
{"model": "google/gemini-2.5-flash", "model_limit": 8192, "text_length": 1580, "estimated_text_tokens": 644, "estimated_system_tokens": 966, "estimated_template_tokens": 1242, "total_estimated_tokens": 2852}
2026-03-24 10:46:15,783 INFO transcription_utils.py:_extract_system_prompt:1753 TransformerProcessor a479c418-4b95-495a-be82-6a53e71fc1ae Systemprompt aus Template extrahiert
{"prompt_length": 2504}
2026-03-24 10:46:15,783 INFO transcription_utils.py:transform_by_template:1042 TransformerProcessor a479c418-4b95-495a-be82-6a53e71fc1ae Verwende direkt übergebenes Template-Inhalt (Länge: 5567)
2026-03-24 10:46:15,783 INFO transcription_utils.py:transform_by_template:1027 TransformerProcessor a479c418-4b95-495a-be82-6a53e71fc1ae Starte Template-Transformation:
2026-03-24 10:46:15,783 DEBUG transformer_processor.py:__init__:199 TransformerProcessor a479c418-4b95-495a-be82-6a53e71fc1ae Transformer Init Timings
{"total_init_ms": 206.5, "super_init_ms": 94.88, "config_load_ms": 19.13, "client_init_ms": 0.01, "transcriber_init_ms": 92.16}
2026-03-24 10:46:15,783 DEBUG transformer_processor.py:__init__:179 TransformerProcessor a479c418-4b95-495a-be82-6a53e71fc1ae Transformer Processor initialisiert
{"model": "google/gemini-2.5-flash", "temperature": 0.1, "max_tokens": 4000, "target_format": "text", "max_concurrent_requests": 10, "timeout_seconds": 120, "templates_dir": "resources/templates"}
2026-03-24 10:46:15,577 DEBUG base_processor.py:init_logger:386 TransformerProcessor a479c418-4b95-495a-be82-6a53e71fc1ae Logger initialisiert
2026-03-24 10:46:14,583 INFO transcription_utils.py:transform_by_template:1461 TransformerProcessor 03104cc3-93ed-4133-94e3-54e016745059 Template-Transformation abgeschlossen
{"duration_ms": 5858.017444610596, "model": "google/gemini-2.5-flash"}
2026-03-24 10:46:14,582 DEBUG transcription_utils.py:create_llm_request:455 TransformerProcessor 03104cc3-93ed-4133-94e3-54e016745059 LLM Interaktion gespeichert
{"file": "cache/transformer/temp/debug/llm/20260324_104614_template_transform.json", "tokens": 3649}
2026-03-24 10:46:14,581 DEBUG transcription_utils.py:create_llm_request:416 TransformerProcessor 03104cc3-93ed-4133-94e3-54e016745059 LLM Request erstellt
{"purpose": "template_transform", "tokens": 3649, "duration": 5858.017444610596, "model": "google/gemini-2.5-flash", "processor": "WhisperTranscriber"}
2026-03-24 10:46:14,581 DEBUG base_processor.py:add_llm_requests:510 TransformerProcessor 03104cc3-93ed-4133-94e3-54e016745059 1 LLM-Requests hinzugefügt
2026-03-24 10:46:14,036 INFO rag_processor.py:embed_document_for_client:571 RAGProcessor eece8307-3bcc-48d0-add2-1d506185c628 Embedding (client) abgeschlossen
{"extra": {"document_id": "a2d80a97-357b-4076-a06b-18405b93475f", "chunks": 1, "duration_seconds": 0.2913482189178467}}
2026-03-24 10:46:14,031 INFO pdf_processor.py:process_mistral_ocr_with_pages:967 TransformerProcessor job-340a0fa7-176a-4d6c-81d9-dbd464730edc Mistral-OCR mit Seiten: Verarbeitung abgeschlossen
{"progress": 90}
2026-03-24 10:46:13,745 INFO rag_processor.py:embed_document_for_client:516 RAGProcessor eece8307-3bcc-48d0-add2-1d506185c628 Starte Embedding-Generierung (client)
{"extra": {"chunk_count": 1, "embedding_model": "voyage-3-large", "embedding_dimensions": 2048}}
2026-03-24 10:46:13,745 INFO rag_processor.py:embed_document_for_client:510 RAGProcessor eece8307-3bcc-48d0-add2-1d506185c628 Chunking abgeschlossen (client): 1 Chunks erstellt
2026-03-24 10:46:13,745 INFO rag_processor.py:embed_document_for_client:508 RAGProcessor eece8307-3bcc-48d0-add2-1d506185c628 Starte Chunking (client)
{"extra": {"document_id": "a2d80a97-357b-4076-a06b-18405b93475f"}}
2026-03-24 10:46:13,744 INFO rag_processor.py:__init__:124 RAGProcessor eece8307-3bcc-48d0-add2-1d506185c628 RAGProcessor initialisiert
{"extra": {"embedding_model": "voyage-3-large", "embedding_dimensions": 2048, "chunk_size": 1000, "chunk_overlap": 200}}
2026-03-24 10:46:13,654 INFO pdf_processor.py:process_mistral_ocr_with_pages:909 TransformerProcessor job-340a0fa7-176a-4d6c-81d9-dbd464730edc Mistral OCR Bilder direkt in ZIP gepackt: 4 Bilder in mistral_ocr_images_job-340a0fa7-176a-4d6c-81d9-dbd464730edc.zip
2026-03-24 10:46:13,618 DEBUG base_processor.py:init_logger:386 RAGProcessor eece8307-3bcc-48d0-add2-1d506185c628 Logger initialisiert
2026-03-24 10:46:13,520 INFO rag_processor.py:embed_document_for_client:571 RAGProcessor 9ce4c2ea-f977-46c7-8541-050a06fef853 Embedding (client) abgeschlossen
{"extra": {"document_id": "V2Vic2VpdGUvMjAyMy8yMDIzLTA1IFNUT1AgUGVzdGl6aWRlL1NUT1AgUGVzdGl6aWRlIDgyOTUxYmNiNjk2MzQ2ZmE4OTk1NWExMWNhMWNjYzA2Lm1k", "chunks": 4, "duration_seconds": 0.3261275291442871}}
2026-03-24 10:46:13,251 INFO pdf_processor.py:_extract_pdf_pages_as_images:657 TransformerProcessor job-340a0fa7-176a-4d6c-81d9-dbd464730edc PDF-Seiten als Bilder extrahiert: 2 Seiten, ZIP: cache/pdf/temp/pdf/d4e8e68bb71abe16/pages.zip
2026-03-24 10:46:13,195 INFO rag_processor.py:embed_document_for_client:516 RAGProcessor 9ce4c2ea-f977-46c7-8541-050a06fef853 Starte Embedding-Generierung (client)
{"extra": {"chunk_count": 4, "embedding_model": "voyage-3-large", "embedding_dimensions": 2048}}
2026-03-24 10:46:13,194 INFO rag_processor.py:embed_document_for_client:510 RAGProcessor 9ce4c2ea-f977-46c7-8541-050a06fef853 Chunking abgeschlossen (client): 4 Chunks erstellt
2026-03-24 10:46:13,194 INFO rag_processor.py:embed_document_for_client:508 RAGProcessor 9ce4c2ea-f977-46c7-8541-050a06fef853 Starte Chunking (client)
{"extra": {"document_id": "V2Vic2VpdGUvMjAyMy8yMDIzLTA1IFNUT1AgUGVzdGl6aWRlL1NUT1AgUGVzdGl6aWRlIDgyOTUxYmNiNjk2MzQ2ZmE4OTk1NWExMWNhMWNjYzA2Lm1k"}}
2026-03-24 10:46:13,194 INFO rag_processor.py:__init__:124 RAGProcessor 9ce4c2ea-f977-46c7-8541-050a06fef853 RAGProcessor initialisiert
{"extra": {"embedding_model": "voyage-3-large", "embedding_dimensions": 2048, "chunk_size": 1000, "chunk_overlap": 200}}
2026-03-24 10:46:13,091 DEBUG base_processor.py:init_logger:386 RAGProcessor 9ce4c2ea-f977-46c7-8541-050a06fef853 Logger initialisiert
2026-03-24 10:46:12,839 INFO pdf_processor.py:_process_mistral_ocr:590 TransformerProcessor job-340a0fa7-176a-4d6c-81d9-dbd464730edc Mistral-OCR: Ergebnis geparst (2 Seiten)
{"progress": 85}
2026-03-24 10:46:12,839 DEBUG pdf_processor.py:_process_mistral_ocr:564 TransformerProcessor job-340a0fa7-176a-4d6c-81d9-dbd464730edc Mistral-OCR: OCR Response Struktur
{"response_keys": ["pages", "model", "document_annotation", "usage_info"], "pages_count": 2, "has_images": false}