Léo-Paul Géneau / wendelin / Commits / a5977f84

Commit a5977f84, authored Jan 30, 2018 by Klaus Wölfel
data bucket stream: overwrite on reingestion, more api
parent 657559f4

Showing 1 changed file with 31 additions and 2 deletions:
bt5/erp5_wendelin/DocumentTemplateItem/portal_components/document.erp5.DataBucketStream.py (+31 -2)
bt5/erp5_wendelin/DocumentTemplateItem/portal_components/document.erp5.DataBucketStream.py
@@ -197,7 +197,7 @@ class DataBucketStream(Document):
     """
     Wether bucket with such key exists
     """
-    return self._tree.has_key(key)
+    return key in self._tree
 
   def hasBucketIndex(self, index):
     """
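The first hunk only changes the spelling of a membership test. This matters because `has_key()` was removed from dicts in Python 3, while the `in` operator works on both plain dicts and ZODB BTrees, so `key in self._tree` is the portable form. A minimal standalone sketch (not the commit's code; it falls back to a plain dict when the `BTrees` package is unavailable):

```python
# Sketch: ``in`` is the portable membership test for both dicts and BTrees,
# whereas ``has_key()`` was removed in Python 3.
try:
    # ZODB BTrees, the mapping type DataBucketStream uses for its buckets
    from BTrees.OOBTree import OOBTree
    tree = OOBTree()
except ImportError:
    tree = {}  # plain dict stand-in; the ``in`` operator behaves the same

tree["bucket-1"] = "payload"

print("bucket-1" in tree)  # True
print("bucket-2" in tree)  # False
```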
@@ -219,7 +219,11 @@ class DataBucketStream(Document):
       self._long_index_tree.insert(count, key)
     except AttributeError:
       pass
-    return self._tree.insert(key, PersistentString(value))
+    value = PersistentString(value)
+    is_new_key = self._tree.insert(key, value)
+    if not is_new_key:
+      self.log("Reingestion of same key")
+      self._tree[key] = value
 
   def getBucketKeySequenceByKey(self, start_key=None, stop_key=None, count=None,
                                 exclude_start_key=False, exclude_stop_key=False):
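The second hunk implements the "overwrite on reingestion" part of the commit message. `BTree.insert(key, value)` returns a truthy value when the key is new and 0 when the key already exists, in which case the stored value is left untouched; the old code therefore silently kept the first ingested value. The new code detects that case, logs it, and overwrites explicitly. A hedged sketch of the same behaviour, using a plain dict and a list in place of the BTree and the ERP5 `log()` method:

```python
# Sketch of overwrite-on-reingestion semantics (not the actual ERP5 code).
def insert_bucket(tree, log, key, value):
    # mimics BTree.insert()'s return code: truthy only when the key is new
    is_new_key = key not in tree
    if not is_new_key:
        log.append("Reingestion of same key")
    # overwrite instead of silently keeping the previously stored value
    tree[key] = value
    return is_new_key

tree, log = {}, []
insert_bucket(tree, log, "k", b"v1")
insert_bucket(tree, log, "k", b"v2")  # second ingestion overwrites
print(tree["k"], log)                 # b'v2' ['Reingestion of same key']
```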
@@ -362,3 +366,28 @@ class DataBucketStream(Document):
     h = hashlib.md5()
     h.update(self.getBucketByKey(key))
     return h.hexdigest()
+
+  def delBucketByKey(self, key):
+    """
+    Remove the bucket.
+    """
+    del self._tree[key]
+    for index, my_key in list(self.getBucketIndexKeySequenceByIndex()):
+      if my_key == key:
+        del self._long_index_tree[index]
+
+  def delBucketByIndex(self, index):
+    """
+    Remove the bucket.
+    """
+    key = self._long_index_tree[index]
+    del self._tree[key]
+    del self._long_index_tree[index]
+
+  def rebuildIndexTreeByKeyOrder(self):
+    """
+    Clear and rebuild the index tree by order of keys
+    """
+    self.initIndexTree()
+    for count, key in enumerate(self.getBucketKeySequenceByKey()):
+      self._long_index_tree.insert(count, key)
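The third hunk is the "more api" part: deletion helpers that keep the two trees consistent — `_tree` maps bucket key to value, `_long_index_tree` maps a numeric index to a key. `delBucketByKey` must scan the index tree for entries pointing at the deleted key, and `rebuildIndexTreeByKeyOrder` renumbers indices from scratch in key order. A minimal sketch of that bookkeeping, with plain dicts standing in for the BTrees and `sorted()` standing in for the BTree's key-ordered iteration:

```python
# Sketch (not the ERP5 code): keeping a key->value tree and an index->key
# tree consistent on deletion, as the new helpers do.
def del_bucket_by_key(tree, index_tree, key):
    del tree[key]
    # scan for every index entry that still points at the deleted key
    for index, my_key in list(index_tree.items()):
        if my_key == key:
            del index_tree[index]

def rebuild_index_tree_by_key_order(tree, index_tree):
    # clear and renumber: index i -> i-th smallest key
    index_tree.clear()
    for count, key in enumerate(sorted(tree)):
        index_tree[count] = key

tree = {"a": b"1", "b": b"2", "c": b"3"}
index_tree = {0: "a", 1: "b", 2: "c"}
del_bucket_by_key(tree, index_tree, "b")
rebuild_index_tree_by_key_order(tree, index_tree)
print(index_tree)  # {0: 'a', 1: 'c'}
```

Note the index scan in `del_bucket_by_key` is linear in the number of buckets; `rebuildIndexTreeByKeyOrder` exists to restore a dense, gap-free numbering after such deletions.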