High memory consumption INSERTing Decimal type #1293
@jkaflik any timeframe on this?
@flarco, thanks for reporting this. There is definitely some unexpected behavior here. Could you clarify, please? @alisman initially assigned me to triage this. Currently, we don't have the capacity to look at it soon. Any contributions are welcome.
Limiting the number of rows inserted per transaction helps a lot with lowering the memory usage, so that works as a workaround. The issue stands, though: if millions of rows are inserted in one transaction, the memory growth crashes the process.
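A minimal sketch of that workaround, shown here with the native clickhouse-go v2 API (the address, table name, columns, and chunk size are all assumptions, not from this thread): sending one moderately sized batch at a time keeps the per-batch column buffers bounded instead of accumulating millions of rows in a single INSERT.

```go
package main

import (
	"context"
	"log"

	"github.com/ClickHouse/clickhouse-go/v2"
	"github.com/ClickHouse/clickhouse-go/v2/lib/driver"
	"github.com/shopspring/decimal"
)

const chunkSize = 100_000 // tune to taste; smaller chunks bound per-batch memory

// insertChunked sends one batch (one INSERT) per chunk instead of one huge batch.
func insertChunked(ctx context.Context, conn driver.Conn, rows []decimal.Decimal) error {
	for start := 0; start < len(rows); start += chunkSize {
		end := start + chunkSize
		if end > len(rows) {
			end = len(rows)
		}
		batch, err := conn.PrepareBatch(ctx, "INSERT INTO example (amount, n)")
		if err != nil {
			return err
		}
		for i, d := range rows[start:end] {
			if err := batch.Append(d, int64(start+i)); err != nil {
				return err
			}
		}
		// Send flushes this chunk; its column buffers become collectable afterwards.
		if err := batch.Send(); err != nil {
			return err
		}
	}
	return nil
}

func main() {
	conn, err := clickhouse.Open(&clickhouse.Options{Addr: []string{"localhost:9000"}})
	if err != nil {
		log.Fatal(err)
	}
	defer conn.Close()
	if err := insertChunked(context.Background(), conn, nil); err != nil {
		log.Fatal(err)
	}
}
```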
Same for us. We are inserting around 1 million records per INSERT (we can't decrease that, unfortunately; there is too much data) and the memory consumption is enormous.
How do you perform the INSERT?
@jkaflik I'll answer instead of @gofort. We insert 1M records; each record has 52 columns. Here is the structure we pass to the method:

```go
import (
    "net"
    "time"
)

type Item struct {
    FieldA  time.Time `ch:"field_a"`
    FieldB  time.Time `ch:"field_b"`
    FieldC  net.IP    `ch:"field_c"`
    FieldD  string    `ch:"field_d"`
    FieldE  string    `ch:"field_e"`
    FieldF  string    `ch:"field_f"`
    FieldG  string    `ch:"field_g"`
    FieldH  string    `ch:"field_h"`
    FieldI  uint16    `ch:"field_i"`
    FieldJ  int64     `ch:"field_j"`
    FieldK  string    `ch:"field_k"`
    FieldL  string    `ch:"field_l"`
    FieldM  int64     `ch:"field_m"`
    FieldN  string    `ch:"field_n"`
    FieldO  uint32    `ch:"field_o"`
    FieldP  string    `ch:"field_p"`
    FieldQ  []uint32  `ch:"field_q"`
    FieldR  []int64   `ch:"field_r"`
    FieldS  string    `ch:"field_s"`
    FieldT  []uint16  `ch:"field_t"`
    FieldU  []uint32  `ch:"field_u"`
    FieldV  []uint32  `ch:"field_v"`
    FieldW  int32     `ch:"field_w"`
    FieldX  int32     `ch:"field_x"`
    FieldY  string    `ch:"field_y"`
    FieldZ  net.IP    `ch:"field_z"`
    FieldAA string    `ch:"field_aa"`
    FieldAB string    `ch:"field_ab"`
    FieldAC string    `ch:"field_ac"`
    FieldAD uint32    `ch:"field_ad"`
    FieldAE string    `ch:"field_ae"`
    FieldAF string    `ch:"field_af"`
    FieldAG string    `ch:"field_ag"`
    FieldAH string    `ch:"field_ah"`
    FieldAI string    `ch:"field_ai"`
    FieldAJ string    `ch:"field_aj"`
    FieldAK string    `ch:"field_ak"`
    FieldAL string    `ch:"field_al"`
    FieldAM string    `ch:"field_am"`
    FieldAN string    `ch:"field_an"`
    FieldAO uint8     `ch:"field_ao"`
    FieldAP string    `ch:"field_ap"`
    FieldAQ []net.IP  `ch:"field_aq"`
    FieldAR uint64    `ch:"field_ar"`
    FieldAS string    `ch:"field_as"`
    FieldAT uint32    `ch:"field_at"`
    FieldAU uint32    `ch:"field_au"`
    FieldAV string    `ch:"field_av"`
    FieldAW uint16    `ch:"field_aw"`
    FieldAX uint16    `ch:"field_ax"`
    FieldAY int8      `ch:"field_ay"`
    FieldAZ string    `ch:"field_az"`
}
```

The table in the database contains the following columns:
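For context, a minimal sketch of how a batch of such structs is typically appended with the native v2 batch API, assuming the unnamed method above is AppendStruct; the helper and the table name "items" are hypothetical:

```go
package main

import (
	"context"

	"github.com/ClickHouse/clickhouse-go/v2/lib/driver"
)

// insertItems appends each Item to one batch and sends it in a single INSERT.
func insertItems(ctx context.Context, conn driver.Conn, items []Item) error {
	batch, err := conn.PrepareBatch(ctx, "INSERT INTO items")
	if err != nil {
		return err
	}
	for i := range items {
		// AppendStruct maps each struct field to its column via the ch tag.
		if err := batch.AppendStruct(&items[i]); err != nil {
			return err
		}
	}
	return batch.Send()
}
```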
@vkazmirchuk what memory usage do you notice? Do you have a pprof memory report? I strongly recommend creating another issue; I want this one to stay strictly focused on Decimal type support, as originally reported by the issue creator.
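For anyone producing such a report, a standard way to expose a heap profile is the standard library's net/http/pprof (the port here is arbitrary):

```go
package main

import (
	"log"
	"net/http"
	_ "net/http/pprof" // registers /debug/pprof handlers on the default mux
)

func main() {
	go func() {
		log.Println(http.ListenAndServe("localhost:6060", nil))
	}()
	// ... run the inserting workload here, then capture a profile with:
	//   go tool pprof http://localhost:6060/debug/pprof/heap
	select {}
}
```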
Thanks! I created issue #1384 |
Observed

Hi, I am seeing big memory usage from resultFromStatement after making an INSERT query. I believe the leak is happening here: clickhouse-go/lib/column/decimal.go, line 238 (at commit 28fd6a4). See the graph below.

This happens when I insert millions of rows with Decimal & Int64 values. The decimal values are made using github.com/shopspring/decimal, with decimal.NewFromString.
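For context, a tiny sketch of how such values are constructed with shopspring/decimal (the literal is a hypothetical example):

```go
package main

import (
	"fmt"

	"github.com/shopspring/decimal"
)

func main() {
	// NewFromString parses an arbitrary-precision decimal from its string form.
	d, err := decimal.NewFromString("12345.6789")
	if err != nil {
		panic(err)
	}
	fmt.Println(d) // 12345.6789
}
```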
I am actually not using any of the result, since I am making an INSERT. Not sure why it's appending from the result and taking so much memory. Here is the pprof output:
Expected behaviour
Should not take up much memory after an INSERT call.

Code example
See https://github.com/slingdata-io/sling-cli/blob/main/core/dbio/database/database_clickhouse.go#L157
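For reference, a minimal self-contained reproducer in the same spirit as the linked code, using the database/sql interface (the DSN, table, and column names are hypothetical placeholders):

```go
package main

import (
	"database/sql"
	"log"

	_ "github.com/ClickHouse/clickhouse-go/v2" // registers the "clickhouse" driver name
	"github.com/shopspring/decimal"
)

func main() {
	db, err := sql.Open("clickhouse", "clickhouse://localhost:9000/default")
	if err != nil {
		log.Fatal(err)
	}
	defer db.Close()

	tx, err := db.Begin()
	if err != nil {
		log.Fatal(err)
	}
	stmt, err := tx.Prepare("INSERT INTO example (amount, n) VALUES (?, ?)")
	if err != nil {
		log.Fatal(err)
	}
	for i := 0; i < 1_000_000; i++ {
		d, err := decimal.NewFromString("123.456789")
		if err != nil {
			log.Fatal(err)
		}
		// Each Exec buffers a row client-side; this is where memory reportedly grows.
		if _, err := stmt.Exec(d, int64(i)); err != nil {
			log.Fatal(err)
		}
	}
	// Commit sends the buffered rows to the server in one INSERT.
	if err := tx.Commit(); err != nil {
		log.Fatal(err)
	}
}
```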
Environment

- clickhouse-go version: v2.24.0
- Interface: database/sql
- ClickHouse Server version: yandex/clickhouse-server:21.3
- CREATE TABLE statements for tables involved: