Had a random crash last week; just dropping it here FYI, it didn't happen again. I'm using the Aphrodite engine as the endpoint.
My bridgeData.yaml:

```yaml
allow_controlnet: false
allow_img2img: false
allow_lora: false
allow_painting: false
allow_post_processing: false
allow_unsafe_ip: true
always_download: false
api_key: redacted
blacklist: []
branded_model: false
cache_home: ./
censor_nsfw: false
censorlist: []
disable_disk_cache: false
disable_terminal_ui: true
dynamic_models: false
forms:
  - caption
  - nsfw
  - interrogation
  - post-process
horde_url: https://aihorde.net
kai_url: http://localhost:5000
max_context_length: 4096
max_length: 512
max_lora_cache_size: 10
max_models_to_download: 10
max_power: 8
max_threads: 4
models_to_load:
  - Top 2
models_to_skip:
  - pix2pix
nsfw: true
number_of_dynamic_models: 0
priority_usernames: []
queue_size: 0
ram_to_leave_free: 80%
require_upfront_kudos: false
scribe_name: redacted
stats_output_frequency: 30
suppress_speed_warnings: false
temp_dir: ./tmp
threads: 4
vram_to_leave_free: 80%
worker_name: redacted
```
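Side note on the config: YAML resolves values like `80%` to plain strings, not numbers, so anything consuming `ram_to_leave_free`/`vram_to_leave_free` has to parse the percent sign itself. A minimal sketch (the helper name is mine, not part of AI-Horde-Worker):

```python
# Hypothetical sanity check for the percent-style values in bridgeData.yaml.
# In YAML, "80%" is a string, so it needs explicit conversion.
def parse_percent(value: str) -> float:
    """Turn a string like '80%' into the fraction 0.8."""
    return float(value.rstrip("%")) / 100.0

config = {"ram_to_leave_free": "80%", "max_threads": 4, "threads": 4}
assert parse_percent(config["ram_to_leave_free"]) == 0.8
assert config["threads"] <= config["max_threads"]
```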
Here is the console output:
```
ERROR | 2023-12-01 15:23:05.840281 | __main__:main:23 - An error has been caught in function 'main', process 'MainProcess' (1516), thread 'MainThread' (140647097917440):

Traceback (most recent call last):
  File "/usr/local/lib/python3.10/dist-packages/urllib3/response.py", line 761, in _update_chunk_length
    self.chunk_left = int(line, 16)
ValueError: invalid literal for int() with base 16: b''

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.10/dist-packages/urllib3/response.py", line 444, in _error_catcher
    yield
  File "/usr/local/lib/python3.10/dist-packages/urllib3/response.py", line 828, in read_chunked
    self._update_chunk_length()
  File "/usr/local/lib/python3.10/dist-packages/urllib3/response.py", line 765, in _update_chunk_length
    raise InvalidChunkLength(self, line)
urllib3.exceptions.InvalidChunkLength: InvalidChunkLength(got length b'', 0 bytes read)

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.10/dist-packages/requests/models.py", line 816, in generate
    yield from self.raw.stream(chunk_size, decode_content=True)
  File "/usr/local/lib/python3.10/dist-packages/urllib3/response.py", line 624, in stream
    for line in self.read_chunked(amt, decode_content=decode_content):
  File "/usr/local/lib/python3.10/dist-packages/urllib3/response.py", line 816, in read_chunked
    with self._error_catcher():
  File "/usr/lib/python3.10/contextlib.py", line 153, in __exit__
    self.gen.throw(typ, value, traceback)
  File "/usr/local/lib/python3.10/dist-packages/urllib3/response.py", line 461, in _error_catcher
    raise ProtocolError("Connection broken: %r" % e, e)
urllib3.exceptions.ProtocolError: ("Connection broken: InvalidChunkLength(got length b'', 0 bytes read)", InvalidChunkLength(got length b'', 0 bytes read))

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/workspace/AI-Horde-Worker/bridge_scribe.py", line 30, in <module>
    main()
  File "/workspace/AI-Horde-Worker/bridge_scribe.py", line 23, in main
    worker.start()
  File "/usr/local/lib/python3.10/dist-packages/loguru/_logger.py", line 1277, in catch_wrapper
    return function(*args, **kwargs)
  File "/workspace/AI-Horde-Worker/worker/workers/framework.py", line 89, in start
    self.process_jobs()
  File "/workspace/AI-Horde-Worker/worker/workers/framework.py", line 119, in process_jobs
    while len(self.running_jobs) < self.bridge_data.max_threads and self.start_job():
  File "/workspace/AI-Horde-Worker/worker/workers/framework.py", line 164, in start_job
    if jobs := self.pop_job():
  File "/workspace/AI-Horde-Worker/worker/workers/scribe.py", line 27, in pop_job
    return super().pop_job()
  File "/workspace/AI-Horde-Worker/worker/workers/framework.py", line 148, in pop_job
    pops = job_popper.horde_pop()
  File "/workspace/AI-Horde-Worker/worker/jobs/poppers.py", line 225, in horde_pop
    if not super().horde_pop():
  File "/workspace/AI-Horde-Worker/worker/jobs/poppers.py", line 32, in horde_pop
    pop_req = requests.post(
  File "/usr/local/lib/python3.10/dist-packages/requests/api.py", line 115, in post
    return request("post", url, data=data, json=json, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/requests/api.py", line 59, in request
    return session.request(method=method, url=url, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/requests/sessions.py", line 589, in request
    resp = self.send(prep, **send_kwargs)
  File "/usr/local/lib/python3.10/dist-packages/requests/sessions.py", line 747, in send
    r.content
  File "/usr/local/lib/python3.10/dist-packages/requests/models.py", line 899, in content
    self._content = b"".join(self.iter_content(CONTENT_CHUNK_SIZE)) or b""
  File "/usr/local/lib/python3.10/dist-packages/requests/models.py", line 818, in generate
    raise ChunkedEncodingError(e)
requests.exceptions.ChunkedEncodingError: ("Connection broken: InvalidChunkLength(got length b'', 0 bytes read)", InvalidChunkLength(got length b'', 0 bytes read))

INFO | 2023-12-01 15:23:06.919222 | worker.jobs.framework:submit_job:169 - Submitted job with id f29b2f10-3009-4fc5-8b02-b997fe6c0906 and contributed for 2.8. Job took 5.3 seconds since queued and 5.3 since start.
```

The crash happened while popping a job: the `requests.post` to `https://aihorde.net/api/v2/generate/text/pop` returned a 200 response, but the chunked body was cut off mid-stream, so decoding it raised the `InvalidChunkLength` / `ChunkedEncodingError` chain above.
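For anyone reading the traceback: the root `ValueError` comes from urllib3 parsing the hex chunk-size line of a chunked HTTP response and finding an empty line, which is what an abruptly closed connection looks like. A stripped-down sketch mirroring that parsing step (the helper name is mine; urllib3 does this inside `HTTPResponse._update_chunk_length`):

```python
# Sketch of the chunk-size parsing that failed. In chunked transfer encoding,
# each chunk is preceded by its size as a hex string, optionally followed by
# a ";ext" chunk extension. An empty size line means the stream ended early.
def parse_chunk_length(line: bytes) -> int:
    line = line.split(b";", 1)[0]  # drop any chunk extension
    return int(line, 16)           # hex chunk size

print(parse_chunk_length(b"2800"))  # a 10240-byte chunk
try:
    parse_chunk_length(b"")  # server closed mid-stream: empty length line
except ValueError as e:
    print(e)  # invalid literal for int() with base 16: b''
```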
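Since this was a one-off network hiccup, one option would be for the worker to retry the pop instead of crashing. A hypothetical sketch (the function is mine, not AI-Horde-Worker API; in practice the transient tuple would include `requests.exceptions.ChunkedEncodingError`):

```python
import time

def retry_transient(fn, attempts=3, base_delay=1.0, transient=(ConnectionError,)):
    """Call fn(), retrying with exponential backoff on transient errors."""
    for attempt in range(attempts):
        try:
            return fn()
        except transient as e:
            if attempt == attempts - 1:
                raise  # out of retries: surface the error
            delay = base_delay * (2 ** attempt)
            print(f"pop failed ({e!r}); retrying in {delay:.0f}s")
            time.sleep(delay)

# e.g. pops = retry_transient(lambda: job_popper.horde_pop())
```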