Files
scylladb/test/cql-pytest/test_large_cells_rows.py
Nadav Har'El c8117584e3 test/cql-pytest: clean up test_create_large_static_cells_and_rows
The test test_create_large_static_cells_and_rows had its own
implementation of "nodetool flush" using Scylla's REST API.
Now that we have a nodetool.flush() function for general use in
cql-pytest, let's use it and save a bit of duplication.
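For reference, the in-test flush that this replaces would have looked roughly like the sketch below. This is illustrative only: the helper names are made up here, and it assumes Scylla's REST API listening on its default port 10000 with the `/storage_service/keyspace_flush/{keyspace}` endpoint:

```python
import urllib.request

def flush_url(keyspace, table=None, host="127.0.0.1", port=10000):
    # Build the REST endpoint; the optional '?cf=' parameter limits the
    # flush to a single table instead of the whole keyspace.
    base = f"http://{host}:{port}/storage_service/keyspace_flush/{keyspace}"
    return base + (f"?cf={table}" if table is not None else "")

def rest_flush(keyspace, table=None):
    # An empty-body POST to the endpoint triggers the memtable flush.
    req = urllib.request.Request(flush_url(keyspace, table), data=b"", method="POST")
    with urllib.request.urlopen(req) as resp:
        resp.read()
```

A shared nodetool.flush() hides these details from each test and, when running against Cassandra, can fall back to the real nodetool binary instead of Scylla's REST API.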

Another benefit is that now this test can be run (and pass) against
Cassandra.

To allow this test to run on Cassandra, I had to remove a
"USING TIMEOUT" clause which wasn't necessary for this test, and is
a Scylla extension not supported by Cassandra.
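As an illustration (the statement below is made up for this sketch, not the one from the test): "USING TIMEOUT" is a Scylla-specific CQL extension that sets a per-statement timeout, so a test meant to run on both servers has to drop it:

```python
# Scylla-only form: per-statement timeout (a Scylla CQL extension).
scylla_only = "INSERT INTO ks.t (pk, ck, v) VALUES (?, ?, ?) USING TIMEOUT 5m"

# Portable form, accepted by both Scylla and Cassandra: the same
# statement with the USING TIMEOUT clause removed.
portable = scylla_only.replace(" USING TIMEOUT 5m", "")
```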

Signed-off-by: Nadav Har'El <nyh@scylladb.com>
2021-05-24 12:31:51 +03:00

40 lines
1.7 KiB
Python

# Copyright 2020 ScyllaDB
#
# This file is part of Scylla.
#
# Scylla is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# Scylla is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with Scylla. If not, see <http://www.gnu.org/licenses/>.
from util import new_test_table
import nodetool

def test_create_large_static_cells_and_rows(cql, test_keyspace):
    '''Test that `large_data_handler` successfully reports large static cells
    and static rows and this doesn't cause a crash of the Scylla server.
    This is a regression test for https://github.com/scylladb/scylla/issues/6780'''
    schema = "pk int, ck int, user_ids set<text> static, PRIMARY KEY (pk, ck)"
    with new_test_table(cql, test_keyspace, schema) as table:
        insert_stmt = cql.prepare(f"INSERT INTO {table} (pk, ck, user_ids) VALUES (?, ?, ?)")
        # The default large-data threshold is 1 MB for cells and 10 MB for rows.
        # Use a 10 MB cell to trigger the large-data reporting code for both
        # static cells and static rows simultaneously.
        large_set = {'x' * 1024 * 1024 * 10}
        cql.execute(insert_stmt, [1, 1, large_set])
        nodetool.flush(cql, table)
        # No need to check that the Scylla server is still running here: the
        # test will fail automatically if Scylla crashes.
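A quick check of the sizes involved (assuming the default thresholds of 1 MB per cell and 10 MB per row mentioned in the comment above):

```python
# The single set element the test inserts:
payload_len = len('x' * 1024 * 1024 * 10)

# 10 MiB: an order of magnitude over the 1 MB cell threshold, and large
# enough that the containing static row also reaches the 10 MB row threshold.
assert payload_len == 10 * 1024 * 1024
assert payload_len > 1024 * 1024
```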