Commit
refactoring, changelog
droserasprout committed Oct 29, 2024
1 parent d8eb4be commit 9fc2160
Showing 40 changed files with 213 additions and 266 deletions.
8 changes: 8 additions & 0 deletions CHANGELOG.md
@@ -8,8 +8,16 @@ Releases prior to 7.0 have been removed from this file to declutter search results

## [Unreleased]

### Added

- substrate.events: Added `substrate.events` index kind to process Substrate events.
- substrate.node: Added `substrate.node` datasource to receive data from a Substrate node.
- substrate.subscan: Added `substrate.subscan` datasource to fetch ABIs from Subscan.
- substrate.subsquid: Added `substrate.subsquid` datasource to fetch historical data from Squid Network.

### Fixed

- cli: Don't wrap exceptions with `CallbackError` to avoid shadowing the original exception.
- cli: Fixed `--template` option being ignored when `--quiet` flag is set.
- config: Fixed setting default loglevels when `logging` is a dict.

5 changes: 5 additions & 0 deletions docs/9.release-notes/_8.0_changelog.md
@@ -18,12 +18,17 @@
- starknet.events: Added `starknet.events` index kind to process Starknet events.
- starknet.node: Added Starknet node datasource for last mile indexing.
- starknet.subsquid: Added `starknet.subsquid` datasource to fetch historical data from Subsquid Archives.
- substrate.events: Added `substrate.events` index kind to process Substrate events.
- substrate.node: Added `substrate.node` datasource to receive data from a Substrate node.
- substrate.subscan: Added `substrate.subscan` datasource to fetch ABIs from Subscan.
- substrate.subsquid: Added `substrate.subsquid` datasource to fetch historical data from Squid Network.
- tezos.operations: Added `sr_cement` operation type to process Smart Rollup Cemented Commitments.

### Fixed

- cli: Don't save reports for successful test runs.
- cli: Don't update existing installation in `self install` command unless asked to.
- cli: Don't wrap exceptions with `CallbackError` to avoid shadowing the original exception.
- cli: Fixed `--pre` installer flag.
- cli: Fixed `--template` option being ignored when `--quiet` flag is set.
- cli: Fixed env files not being loaded in some commands.
118 changes: 59 additions & 59 deletions pdm.lock

Large diffs are not rendered by default.

2 changes: 1 addition & 1 deletion pyproject.toml
@@ -69,6 +69,7 @@ dependencies = [
"python-dotenv~=1.0",
"python-json-logger~=2.0",
"ruamel.yaml~=0.18.6",
+"scalecodec~=1.2",
"sentry-sdk~=2.16",
"sqlparse~=0.5",
"starknet-py==0.24.0",
@@ -79,7 +80,6 @@ dependencies = [
"tortoise-orm==0.21.7",
"uvloop~=0.20",
"web3~=7.2",
-"scalecodec",
]

[project.optional-dependencies]
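The `scalecodec` requirement moves from an unpinned entry to the compatible-release specifier `~=1.2`. As a reminder of what PEP 440's `~=` operator accepts, here is a minimal sketch; the helper below is hypothetical, for illustration only, and ignores pre-release and post-release segments (real resolution should use the `packaging` library):

```python
# PEP 440 compatible release: '~=1.2' is equivalent to '>=1.2, <2.0'.
# Hypothetical helper for illustration; not how pip actually resolves versions.
def satisfies_tilde_1_2(version: str) -> bool:
    parts = version.split('.')
    major, minor = int(parts[0]), int(parts[1])
    # Must be at least 1.2, but stay within the 1.x series
    return major == 1 and minor >= 2
```

So `1.2`, `1.2.5`, and `1.9` all satisfy `~=1.2`, while `1.1` and `2.0` do not.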
6 changes: 3 additions & 3 deletions requirements.txt
@@ -13,7 +13,7 @@ apscheduler==3.10.4
argcomplete==3.5.1
asgiref==3.8.1
async-lru==2.0.4
-asyncpg==0.29.0
+asyncpg==0.30.0
attrs==24.2.0
base58==2.1.1
bitarray==3.0.0
@@ -56,7 +56,7 @@ mpmath==1.3.0
msgpack==1.1.0
multidict==6.1.0
mypy-extensions==1.0.0
-orjson==3.10.7
+orjson==3.10.10
packaging==24.1
parsimonious==0.10.0
pathspec==0.12.1
@@ -88,7 +88,7 @@ sniffio==1.3.1
sqlparse==0.5.1
starknet-py==0.24.0
strict-rfc3339==0.7
-survey==5.4.0
+survey==5.4.2
sympy==1.11.1
tabulate==0.9.0
toolz==1.0.0; implementation_name == "pypy" or implementation_name == "cpython"
5 changes: 0 additions & 5 deletions src/demo_blank/replay.yaml

This file was deleted.

5 changes: 0 additions & 5 deletions src/demo_evm_events/replay.yaml

This file was deleted.

5 changes: 0 additions & 5 deletions src/demo_evm_transactions/replay.yaml

This file was deleted.

7 changes: 0 additions & 7 deletions src/demo_evm_uniswap/replay.yaml

This file was deleted.

5 changes: 0 additions & 5 deletions src/demo_starknet_events/replay.yaml

This file was deleted.

3 changes: 2 additions & 1 deletion src/demo_substrate_events/deploy/Dockerfile
@@ -1,5 +1,6 @@
-FROM dipdup/dipdup:8
+# FROM dipdup/dipdup:8
 # FROM ghcr.io/dipdup-io/dipdup:8
+# FIXME: substrate preview
 FROM ghcr.io/dipdup-io/dipdup:feat-substrate

# COPY --chown=dipdup pyproject.toml README.md .
5 changes: 0 additions & 5 deletions src/demo_substrate_events/replay.yaml

This file was deleted.

5 changes: 0 additions & 5 deletions src/demo_tezos_auction/replay.yaml

This file was deleted.

5 changes: 0 additions & 5 deletions src/demo_tezos_dao/replay.yaml

This file was deleted.

5 changes: 0 additions & 5 deletions src/demo_tezos_dex/replay.yaml

This file was deleted.

5 changes: 0 additions & 5 deletions src/demo_tezos_domains/replay.yaml

This file was deleted.

5 changes: 0 additions & 5 deletions src/demo_tezos_etherlink/replay.yaml

This file was deleted.

5 changes: 0 additions & 5 deletions src/demo_tezos_events/replay.yaml

This file was deleted.

5 changes: 0 additions & 5 deletions src/demo_tezos_factories/replay.yaml

This file was deleted.

5 changes: 0 additions & 5 deletions src/demo_tezos_head/replay.yaml

This file was deleted.

5 changes: 0 additions & 5 deletions src/demo_tezos_nft_marketplace/replay.yaml

This file was deleted.

5 changes: 0 additions & 5 deletions src/demo_tezos_raw/replay.yaml

This file was deleted.

5 changes: 0 additions & 5 deletions src/demo_tezos_token/replay.yaml

This file was deleted.

5 changes: 0 additions & 5 deletions src/demo_tezos_token_balances/replay.yaml

This file was deleted.

5 changes: 0 additions & 5 deletions src/demo_tezos_token_transfers/replay.yaml

This file was deleted.

12 changes: 7 additions & 5 deletions src/dipdup/codegen/__init__.py
@@ -169,6 +169,12 @@ async def _generate_type(self, schema_path: Path, force: bool) -> None:
         class_name = self.get_typeclass_name(schema_path)
         self._logger.info('Generating type `%s`', class_name)
         output_path.parent.mkdir(parents=True, exist_ok=True)
+        # TODO: make it configurable
+        model_type = (
+            dmcg.DataModelType.TypingTypedDict
+            if 'substrate' in str(output_path)
+            else dmcg.DataModelType.PydanticV2BaseModel
+        )
         dmcg.generate(
             input_=schema_path,
             output=output_path,
@@ -178,11 +184,7 @@ async def _generate_type(self, schema_path: Path, force: bool) -> None:
             target_python_version=dmcg.PythonVersion.PY_312,
             custom_file_header=CODEGEN_HEADER,
             use_union_operator=True,
-            output_model_type=(
-                dmcg.DataModelType.TypingTypedDict
-                if self.kind == 'substrate'
-                else dmcg.DataModelType.PydanticV2BaseModel
-            ),
+            output_model_type=model_type,
             use_schema_description=True,
         )

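The refactor above hoists the model-type choice out of the `dmcg.generate()` call and keys it on the output path instead of `self.kind`. A standalone sketch of that selection logic, where the enum values are stand-ins for `datamodel_code_generator`'s real members:

```python
from enum import Enum
from pathlib import Path


class DataModelType(Enum):
    # Stand-ins for datamodel_code_generator's DataModelType members (assumed)
    TypingTypedDict = 'typing.TypedDict'
    PydanticV2BaseModel = 'pydantic_v2.BaseModel'


def choose_model_type(output_path: Path) -> DataModelType:
    # Substrate type stubs are emitted as TypedDicts; everything else
    # keeps the default Pydantic v2 models, mirroring the diff above.
    if 'substrate' in str(output_path):
        return DataModelType.TypingTypedDict
    return DataModelType.PydanticV2BaseModel
```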
4 changes: 1 addition & 3 deletions src/dipdup/context.py
@@ -736,8 +736,6 @@ def _callback_wrapper(self, module: str) -> Iterator[None]:
         # NOTE: Do not wrap known errors like ProjectImportError
         except FrameworkException:
             raise
-        # except Exception as e:
-        #     raise CallbackError(module, e) from e

     def _get_handler(self, name: str, index: str) -> HandlerConfig:
         try:
@@ -882,4 +880,4 @@ def _wrap(
     @property
     def is_finalized(self) -> bool:
         # FIXME: check the datasource
-        return 1
+        return True
2 changes: 1 addition & 1 deletion src/dipdup/datasources/abi_etherscan.py
@@ -76,7 +76,7 @@ async def get_abi_failover(self, address: str) -> dict[str, Any]:
             )
         ).text()

-        regex = r'id="js-copytextarea2(.*)>(\[.*?)\<\/pre'
+        regex = r'id=["\']js-copytextarea2(.*)>(\[.*?)\<\/pre'
         if (match := re.search(regex, html)) and (abi := match.group(2)):
             return cast(dict[str, Any], orjson.loads(abi))
         raise DatasourceError('Failed to get ABI', self.name)
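The updated pattern accepts both single- and double-quoted `id` attributes in the scraped explorer page. A quick check of the regex against two hand-made HTML fragments; the fragments are illustrative only, not real explorer output:

```python
import json
import re

REGEX = r'id=["\']js-copytextarea2(.*)>(\[.*?)\<\/pre'

# Illustrative fragments only; real explorer markup is more elaborate.
double_quoted = '<pre id="js-copytextarea2 wordwrap">[{"name": "transfer"}]</pre>'
single_quoted = "<pre id='js-copytextarea2 wordwrap'>[{\"name\": \"transfer\"}]</pre>"

for html in (double_quoted, single_quoted):
    match = re.search(REGEX, html)
    # The old double-quote-only pattern would miss the single-quoted case
    assert match is not None
    abi = json.loads(match.group(2))
    assert abi[0]['name'] == 'transfer'
```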
8 changes: 4 additions & 4 deletions src/dipdup/datasources/evm_node.py
@@ -128,10 +128,10 @@ async def _emitter_loop(self) -> None:
             # NOTE: Push rollback to all EVM indexes, but continue processing.
             if head.level <= known_level:
                 for type_ in (
-                    SubsquidMessageType.blocks,
-                    SubsquidMessageType.logs,
-                    SubsquidMessageType.traces,
-                    SubsquidMessageType.transactions,
+                    SubsquidMessageType.evm_blocks,
+                    SubsquidMessageType.evm_logs,
+                    SubsquidMessageType.evm_traces,
+                    SubsquidMessageType.evm_transactions,
                 ):
                     await self.emit_rollback(
                         type_,
90 changes: 90 additions & 0 deletions src/dipdup/indexes/_subsquid.py
@@ -0,0 +1,90 @@
import random
from abc import ABC
from abc import abstractmethod
from typing import TYPE_CHECKING
from typing import Any
from typing import Generic
from typing import TypeVar

if TYPE_CHECKING:
    from dipdup.context import DipDupContext

from dipdup.datasources.evm_node import NODE_LAST_MILE
from dipdup.datasources.evm_node import EvmNodeDatasource
from dipdup.index import Index
from dipdup.index import IndexQueueItemT
from dipdup.performance import metrics

IndexConfigT = TypeVar('IndexConfigT', bound=Any)
DatasourceT = TypeVar('DatasourceT', bound=Any)


class SubsquidIndex(
    Generic[IndexConfigT, IndexQueueItemT, DatasourceT],
    Index[IndexConfigT, IndexQueueItemT, DatasourceT],
    ABC,
):
    subsquid_datasources: tuple[Any, ...]
    node_datasources: tuple[Any, ...]

    def __init__(self, ctx: 'DipDupContext', config: IndexConfigT, datasources: tuple[DatasourceT, ...]) -> None:
        super().__init__(ctx, config, datasources)
        self._subsquid_started: bool = False

    @abstractmethod
    async def _synchronize_subsquid(self, sync_level: int) -> None: ...

    @abstractmethod
    async def _synchronize_node(self, sync_level: int) -> None: ...

    async def _get_node_sync_level(
        self,
        subsquid_level: int,
        index_level: int,
        node: EvmNodeDatasource | None = None,
    ) -> int | None:
        if not self.node_datasources:
            return None
        node = node or random.choice(self.node_datasources)

        node_sync_level = await node.get_head_level()
        subsquid_lag = abs(node_sync_level - subsquid_level)
        subsquid_available = subsquid_level - index_level
        self._logger.info('Subsquid is %s levels behind; %s available', subsquid_lag, subsquid_available)
        if subsquid_available < NODE_LAST_MILE:
            return node_sync_level
        return None

    async def _synchronize(self, sync_level: int) -> None:
        """Fetch event logs via Fetcher and pass to message callback"""
        index_level = await self._enter_sync_state(sync_level)
        if index_level is None:
            return

        levels_left = sync_level - index_level
        if levels_left <= 0:
            return

        if self.subsquid_datasources:
            subsquid_sync_level = await self.subsquid_datasources[0].get_head_level()
            metrics._sqd_processor_chain_height = subsquid_sync_level
        else:
            subsquid_sync_level = 0

        node_sync_level = await self._get_node_sync_level(subsquid_sync_level, index_level)

        # NOTE: Fetch last blocks from node if there are not enough realtime messages in queue
        if node_sync_level:
            sync_level = min(sync_level, node_sync_level)
            self._logger.debug('Using node datasource; sync level: %s', sync_level)
            await self._synchronize_node(sync_level)
        else:
            sync_level = min(sync_level, subsquid_sync_level)
            await self._synchronize_subsquid(sync_level)

        if not self.node_datasources and not self._subsquid_started:
            self._subsquid_started = True
            self._logger.info('No `evm.node` datasources available; polling Subsquid')
            for datasource in self.subsquid_datasources:
                await datasource.start()

        await self._exit_sync_state(sync_level)
