refactor: applied tooling for predictable lint/fmt/commits

Applied `rye lint --fix`, `rye fmt`, and commitizen commit-message checking to keep the project's linting, formatting, and commit history predictable.
main
Sofus Albert Høgsbro Rose 2024-05-04 20:08:33 +02:00
parent ea8e4104ff
commit a7e3c17c86
Signed by: so-rose
GPG Key ID: AD901CB0F3701434
52 changed files with 952 additions and 821 deletions


@ -10,3 +10,7 @@ trim_trailing_whitespace = false
[*.yml]
indent_style = space
indent_size = 2
[*.yaml]
indent_style = space
indent_size = 2


@ -0,0 +1,19 @@
# See https://pre-commit.com for more information
# See https://pre-commit.com/hooks.html for more hooks
repos:
- repo: https://github.com/astral-sh/ruff-pre-commit
# Ruff version.
rev: v0.4.3
hooks:
# ruff lint
#- id: ruff
# args: [ --fix ]
# ruff fmt
- id: ruff-format
- repo: https://github.com/commitizen-tools/commitizen
rev: master
hooks:
- id: commitizen
- id: commitizen-branch
stages: [push]

TODO.md

@ -54,553 +54,3 @@
- [x] Physical Constant
- [ ] Fix many problems by persisting `_enum_cb_cache` and `_str_cb_cache`.
# VALIDATE
- [ ] Does the imaginary part of a complex phasor scale with the real part? Ex. when doing `V/m -> V/um` conversion, does the phase also scale by 1 million?
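This check can be sketched directly in sympy (hypothetical standalone snippet; the addon itself routes units through its own `spux` helpers):

```python
import sympy as sp
import sympy.physics.units as spu
from sympy.physics.units import convert_to

# Complex phasor with units: (3 + 4i) V/m.
phasor = (3 + 4 * sp.I) * spu.volt / spu.meter

# Convert V/m -> V/um; 1 V/m == 1e-6 V/um.
converted = convert_to(phasor, spu.volt / spu.micrometer)

# Strip the units back off; both real and imaginary parts should scale by 1e-6.
ratio = sp.simplify(converted * spu.micrometer / spu.volt)
```

If `sp.im(ratio)` comes out as `4/1000000`, the imaginary part scales together with the real part, answering the question above.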
# Nodes
## Analysis
- [x] Extract
- [ ] Implement "saved" state for info, and provide the user an indicator that state has been saved w/a button to reset (the state should also be reset when plugging a new data thing in)
- [x] Viz
- [ ] Implement Info-driven planar projection of pixels onto managed image empty.
- [ ] Live-slice 2D field values onto user-controlled image empty from 2D field.
- [ ] SocketType-based visualization support.
- [ ] Pol SocketType: 2D elliptical visualization of Jones vectors.
- [ ] Pol SocketType: 3D Poincare sphere visualization of Stokes vectors.
- [x] Math / Operate Math
- [ ] Remove two-layered dropdown; directly filter operations and use categories to separate them.
- [ ] Implement Expr socket advancements to make a better experience operating between random expression-like sockets.
- [x] Math / Map Math
- [x] Remove "By x" socket set let socket sets only be "Function"/"Expr"; then add a dynamic enum underneath to select "By x" based on data support.
- [ ] Filter the operations based on data support, ex. use positive-definiteness to guide cholesky.
- [ ] Implement support for additional symbols via `Expr`.
- [x] Math / Filter Math
- [ ] Math / Reduce Math
## Inputs
- [x] Wave Constant
- [x] Scene
- [ ] Implement export of scene time via the Blender unit system.
- [ ] Implement optional scene-synced time exporting, so that the simulation definition and scene definition match for analysis needs.
- [x] Constants / Expr Constant
- See IDEAS.
- [x] Constants / Number Constant
- [x] Constants / Vector Constant
- [x] Constants / Physical Constant
- [x] Constants / Scientific Constant
- [ ] Nicer (boxed?) node information: maybe centered headers, in a box, etc.
- [ ] Constants / Unit System Constant
- [ ] Re-implement with `PhysicalType`.
- [ ] Implement presets, including "Tidy3D" and "Blender", shown in the label row.
- [ ] Constants / Blender Constant
- [ ] Fix it!
- [ ] Web / Tidy3D Web Importer
- [ ] Fix the check of folders, actually, just fix `tdcloud` in general!
- [ ] Have a visual indicator for the download status of the currently selected task, as well as its data size.
- [ ] If a task is "selected", lock the cloud task socket, so other tasks can't be selected. While that lock is active, expose a real "download" button. Also make the loose output socket and put out a `FlowPending` until the download is available.
- [ ] A manual download button and a separate re-download button (maybe on the side, round reload boi).
- [ ] An option to pack the data into the blend, with overview of how much data it will take (Base85/base64 has overhead).
- [ ] Default limits for caching/packing.
- [ ] Support importing batched simulations and outputting an `Array` of SimData.
- [ ] File Import / Data File Import
- [ ] Implement `FlowKind.LazyValueFunc` that plays the loading game.
- [ ] Implement `FlowKind.Info` which lets the user describe the data being loaded, for proper further processing.
- [ ] Implement unit system input to guide conversion from numpy data type.
- [ ] Implement datatype dropdown to guide format from disk, prefilled to detected.
- [ ] Implement `FlowKind.Array` that just runs the `LazyValueFunc` as usual.
- [ ] Standardize 1D and 2D array loading/saving on numpy's savetxt with gzip enabled.
- [x] File Import / Tidy3D File Import
## Outputs
- [x] Viewer
- [ ] Consider a "debug" mode
- [ ] Auto-enable plot when creating.
- [ ] Test/support multiple viewers at the same time.
- [ ] Pop-up w/multiline string as alternative to console print.
- [ ] Handle per-tree viewers, so that switching trees doesn't "bleed" state from the old tree.
- [ ] BUG: CTRL+SHIFT+CLICK not on a node shows an error; should just do nothing.
- [x] Web Export / Tidy3D Web Exporter
- [ ] Run checks on-demand, and require they be run before the sim can be uploaded; if the simulation changes, invalidate the checks.
- [ ] Support doing checks in a separate process.
- [ ] We need better ways of doing checks before uploading, like for monitor data size. Maybe a SimInfo node?
- [ ] Accept `Array` of simulations, and upload them as `Batch`.
- [x] File Export / JSON File Export
- [ ] Reevaluate its purpose.
- [ ] File Export / Tidy3D File Export
- [ ] Implement HDF-based export of Tidy3D-exported object (which includes ex. mesh data and such)
- [ ] Also JSON (but indicate somehow that ex. mesh data doesn't come along for the ride).
- [ ] File Export / Data File Export
- [ ] Implement datatype dropdown to guide format on disk.
- [ ] Implement unit system input to guide conversion to numpy data type.
- [ ] Standardize 1D and 2D array loading/saving on numpy's savetxt with gzip enabled.
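That standardization is cheap to get, since `np.savetxt` / `np.loadtxt` gzip transparently whenever the filename ends in `.gz` (filenames below hypothetical):

```python
import tempfile
from pathlib import Path

import numpy as np

arr = np.linspace(0.0, 1.0, 6).reshape(2, 3)

with tempfile.TemporaryDirectory() as tmp:
    path = Path(tmp) / 'field_data.txt.gz'
    np.savetxt(path, arr)      # gzip-compressed automatically via the .gz suffix
    loaded = np.loadtxt(path)  # decompressed automatically on load
```

The same two calls would then serve both the Data File Import and Data File Export nodes.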
## Sources
- [x] Temporal Shapes / Gaussian Pulse Temporal Shape
- [x] Temporal Shapes / Continuous Wave Temporal Shape
- [ ] Merge Gaussian Pulse and Continuous Wave w/a socket set thing, since the I/O is effectively identical.
- [ ] Temporal Shapes / Expr Temporal Shape
- [ ] Specify a Sympy function / data to generate envelope data.
- [ ] Merge with the above.
- [x] Point Dipole Source
- [ ] Use a viz mesh, not empty (empty doesn't play well with alpha hashing).
- [ ] Plane Wave Source
- [ ] **IMPORTANT**: Fix the math so that an actually valid construction emerges!!
- [ ] Uniform Current Source
- [ ] TFSF Source
- [ ] Gaussian Beam Source
- [ ] Astigmatic Gaussian Beam Source
- [ ] EH Array Source
- [ ] EH Equiv Array Source
## Mediums
- [x] Library Medium
- [ ] Implement wavelength-based plot, as opposed to merely the frequency plot.
- [ ] DataFit Medium
- [ ] Implement by migrating the material data fitting logic from the `Tidy3D File Importer`, except now only accept a `Data` input socket, and rely on the `Data File Importer` to do the parsing into an acceptable `Data` socket format.
- [ ] Save the result in the node, specifically in a property (serialized!) and lock the input graph while saved.
- [ ] PEC Medium
- [ ] Isotropic Medium
- [ ] Anisotropic Medium
- [ ] Sellmeier Medium
- [ ] Drude Medium
- [ ] Drude-Lorentz Medium
- [ ] Debye Medium
- [ ] Pole-Residue Medium
- [ ] Non-Linearity / `chi_3` Susceptibility Non-Linearity
- [ ] Non-Linearity / Two-Photon Absorption Non-Linearity
- [ ] Non-Linearity / Kerr Non-Linearity
## Structures
- [ ] BLObject Structure
- [x] GeoNodes Structure
- [ ] Implement a panel system, to make GN trees with a ton of inputs (most of which are not usually needed) actually useful.
- [ ] Properly map / implement Enum input sockets to the GN group.
- [ ] Primitive Structures / Line Structure
- [ ] Primitive Structures / Plane Structure
- [x] Primitive Structures / Box Structure
- [x] Primitive Structures / Sphere Structure
- [ ] Primitive Structures / Cylinder Structure
- [ ] Primitive Structures / PolySlab Structure
## Bounds
- [x] Boundary Conds
- [x] Boundary Cond / PML Bound Cond
- [ ] 1D plot visualizing the effect of parameters on a 1D wave function
- [x] Boundary Cond / Bloch Bound Cond
- [x] Implement "simple" mode aka "periodic" mode in Tidy3D
- [ ] 1D plot visualizing the effect of parameters on a 1D wave function
- [x] Boundary Cond / Absorbing Bound Cond
- [ ] 1D plot visualizing the effect of parameters on a 1D wave function
## Monitors
- [x] EH Field Monitor
- [ ] Method of setting `inf` on dimensions - use a `ManyEnum` maybe to select the injection axis, and let that set the $0$.
- [ ] Revamp the input parameters.
- [x] Power Flux Monitor
- [ ] Permittivity Monitor
- [ ] Diffraction Monitor
- [ ] Projected E/H Field Monitor / Cartesian Projected E/H Field Monitor
- [ ] Use to implement the metalens: <https://docs.flexcompute.com/projects/tidy3d/en/latest/notebooks/Metalens.html>
- [ ] Projected E/H Field Monitor / Angle Projected E/H Field Monitor
- [ ] Projected E/H Field Monitor / K-Space Projected E/H Field Monitor
## Simulations
- [x] FDTDSim
- [ ] By-Medium batching of Structures when building the td.Simulation object, which can have significant performance implications.
- [x] Sim Domain
- [ ] Sim Grid
- [ ] Sim Grid Axes / Auto Sim Grid Axis
- [ ] Sim Grid Axes / Manual Sim Grid Axis
- [ ] Sim Grid Axes / Uniform Sim Grid Axis
- [ ] Sim Grid Axes / Data Sim Grid Axis
## Utilities
- [ ] Separate
- [ ] Use generic Expr socket mode to combine numerical types into either Expr or Data socket.
- [x] Combine
- [ ] Use generic Expr socket mode to combine numerical types into either Expr or Data socket.
- [ ] Be explicit about lower structures taking precedence.
# GeoNodes
- [ ] Tests / Monkey (Suzanne deserves to be simulated; she may need manifolding up though :))
- [ ] Tests / Wood Pile
- [ ] Structures / Primitives / Line
- [ ] Structures / Primitives / Plane
- [x] Structures / Primitives / Box
- [x] Structures / Primitives / Sphere
- [ ] Structures / Primitives / Cylinder
- [x] Structures / Primitives / Ring
- [ ] Structures / Arrays / Cyl
- [ ] Structures / Arrays / Box
- [ ] Structures / Arrays / Sphere
- [ ] Structures / Arrays / Cylinder
- [x] Structures / Arrays / Ring
- [ ] Structures / Hex Arrays / Cyl
- [ ] Structures / Hex Arrays / Box
- [ ] Structures / Hex Arrays / Sphere
- [ ] Structures / Hex Arrays / Cylinder
- [x] Structures / Hex Arrays / Ring
- [ ] Structures / Cavity Arrays / L-Cavity Cylinder
- [ ] Structures / Cavity Arrays / H-Cavity Cylinder
- [ ] Structures / Lattice Arrays / FCC Sphere
- [ ] Structures / Lattice Arrays / BCC Sphere
# Benchmark / Example Sims
- [ ] Research-Grade Experiment
- Membrane 15nm thickness suspended in air
- Square lattice of holes period 900nm (900nm between each hole, air inside holes)
- Holes square radius 100nm
- Square lattice
- Analysis of transmission
- Guided mode resonance
- [ ] Tunable Chiral Metasurface <https://docs.flexcompute.com/projects/tidy3d/en/latest/notebooks/TunableChiralMetasurface.html>
# Sockets
## Basic
- [x] Any
- [x] Bool
- [x] String
- [x] File Path
- [x] Color
- [x] Expr
- [ ] Implement node-driven support for dynamic symbols.
- [ ] Implement compatibility with sockets that fundamentally do produce expressions, especially Physical sockets.
## Number
- [x] Integer
- [x] Rational
- [x] Real
- [ ] Implement min/max for ex. 0..1 factor support.
- [x] Complex
## Blender
- [x] Object
- [ ] Implement default object name in SocketDef
- [x] Collection
- [ ] Implement default collection name in SocketDef
- [x] Image
- [ ] Implement default image name in SocketDef
- [x] GeoNodes
- [ ] Implement default SocketDef geonodes name
- [x] Text
- [ ] Implement default SocketDef object name
## Maxwell
- [x] Bound Conds
- [ ] Bound Cond
- [x] Medium
- [ ] Medium Non-Linearity
- [x] Source
- [ ] Temporal Shape
- [ ] Sane-default pulses for easy access.
- [ ] Structure
- [ ] Monitor
- [ ] FDTD Sim
- [ ] Sim Domain
- [ ] Toggleable option to push-sync the simulation time duration to the scene end time (how to handle FPS vs time-step? Should we adjust the FPS such that there is one time step per frame, while keeping the definition of "second" aligned to the Blender unit system?)
- [ ] Sim Grid
- [ ] Sim Grid Axis
- [ ] Simulation Data
## Tidy3D
- [x] Cloud Task
- [ ] Move API checking out of the socket, and don't re-prompt for a key if the config file exists.
- [ ] Remove the existing task selector when making a new task.
- [ ] Implement "new folder" feature w/popup operator.
- [ ] Implement "delete task" feature w/popup confirmation.
## Physical
- [x] Unit System
- [ ] Presets for Blender and Tidy3D
- [ ] Dropdowns in the socket UI
- [x] Time
- [x] Angle
- [ ] Remove superfluous units.
- [ ] Solid Angle (steradian)
- [x] Frequency (hertz)
- [ ] Angular Frequency (`rad*hertz`)
### Cartesian
- [x] Length
- [x] Area
- [x] Volume
- [ ] Point 1D
- [ ] Point 2D
- [x] Point 3D
- [ ] Size 2D
- [x] Size 3D
- [ ] Rotation 3D
- [ ] Implement Euler methods
- [ ] Implement Quaternion methods
### Mechanical
- [ ] Mass
- [x] Speed
- [ ] Velocity 3D
- [x] Acceleration Scalar
- [ ] Acceleration 3D
- [x] Force Scalar
- [ ] Force 3D
- [ ] Pressure
### Energy
- [ ] Energy (joule)
- [ ] Power (watt)
- [ ] Temperature
### Electrodynamical
- [ ] Current (ampere)
- [ ] Current Density 3D
- [ ] Charge (coulomb)
- [ ] Voltage (volts)
- [ ] Capacitance (farad)
- [ ] Resistance (ohm)
- [ ] Electric Conductance (siemens)
- [ ] Magnetic Flux (weber)
- [ ] Magnetic Flux Density (tesla)
- [ ] Inductance (henry)
- [ ] Electric Field 3D (`volt/meter`)
- [ ] Magnetic Field 3D (tesla)
### Luminal
- [ ] Luminous Intensity (candela)
- [ ] Luminous Flux (lumen)
- [ ] Illuminance (lux)
### Optical
- [ ] Jones Polarization
- [ ] Polarization (Stokes)
# Internal / Architecture
## CRITICAL
- [ ] Rethink the way that loose sockets are replaced, specifically with respect to deterministic ordering.
- Currently order is not guaranteed. This is causing problems.
## User-Facing Errors and Legal Considerations
- [ ] `log.error` should invoke `self.report` in some Blender operator - used for errors that are due to usage error (which can't simply be prevented with UX design, like text file formatting of import), not due to error in the program.
- [ ] License header UI for MaxwellSimTrees, to clarify the AGPL-compatible potentially user-selected license that trees must be distributed under.
- [ ] A "CitationsFlow" FlowKind which simply propagates citations.
- [ ] Implement standardization of nodes/sockets w/individualized SemVer
- Perhaps keep node / socket versions in a property, so that trying to load an incompatible major version hop can error w/indicator of where to find a compatible `blender_maxwell` version.
- Integrate w/BLField, to help the user manage addon updates that would break their tree.
## Documentation
- [ ] Make all modules available
- [ ] Publish documentation site.
- [ ] Initial user guides w/pictures.
- [ ] Comb through and finish `__doc__`s.
## Performance
- [ ] Optimize GN value pushing w/sympy expression hashing.
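A minimal sketch of that optimization (names hypothetical): key each GN socket by a structural representation of the last-pushed expression, and skip the push when nothing changed:

```python
import sympy as sp

_last_pushed: dict[str, str] = {}

def push_gn_value(socket_id: str, expr: sp.Expr, do_push) -> bool:
    """Push `expr` to a GeoNodes socket only if it changed since the last push."""
    key = sp.srepr(expr)  # deterministic structural representation of the expression
    if _last_pushed.get(socket_id) == key:
        return False  # unchanged expression: skip the expensive push
    _last_pushed[socket_id] = key
    do_push(expr)
    return True

pushes = []
x = sp.Symbol('x')
push_gn_value('radius', x + 1, pushes.append)
push_gn_value('radius', x + 1, pushes.append)  # deduplicated: no second push
```

`sp.srepr` is used instead of `hash()` so the key survives across Python sessions, which matters if the cache is ever persisted.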
## Style
Header color style can't be done, unfortunately. Body color feels unclean, so nothing there for now.
- [ ] Node icons to denote preview/plot state.
## Registration and Contracts
- [ ] Refactor the node category code; it's ugly.
- It's maybe not that easy. And it seems to work with surprising reliability. Leave it alone for now!
- [ ] (?) Would be nice to have some kind of indicator somewhere to help set good socket descriptions when making geonodes.
## Managed Objects
- [ ] Implement ManagedEmpty
- [ ] Implement image-based empty connected to an image (which is managed by a different ManagedImage owned by the same node instance)
- [ ] Implement ManagedVol
- [ ] Implement loading the xarray-defined voxels into OpenVDB, saving it, and loading it as a managed BL object with the volume setting.
- [ ] Implement basic jax-driven volume voxel processing, especially cube based slicing.
- [ ] Implement jax-driven linear interpolation of volume voxels to an image texture, whose pixels are sized according to the dimensions of another managed plane object (perhaps a uniquely described Managed BL object itself).
## Utils or Services
- [ ] Document the `tdcloud` service thoroughly and open a GitHub discussion about `td.web` shortcomings.
## Node Base Class
- [ ] Re-engineer "presets" to use an Enum of some kind.
## Events
- [ ] When a Blender object is selected, select the node that owns its ManagedObj.
- [ ] Node button / shortcut / something to select the ManagedObj owned by a node.
- Sync transformation of Blender object by user to its node properties.
- See <https://archive.blender.org/developer/P563>
- Also see <https://blender.stackexchange.com/questions/150809/how-to-get-an-event-when-an-object-is-selected>
## Socket Base Class
- [ ] Collect `SocketDef` objects like we do with `BL_REGISTER`, without any special mojo sauce.
## Many Nodes
- [ ] Implement "Steady-State" / "Time Domain" on all relevant Monitor nodes
- [ ] Medium Features
- [ ] Accept spatial field. Else, spatial uniformity.
- [ ] Accept non-linearity. Else, linear.
- [ ] Accept space-time modulation. Else, static.
- [ ] Modal Features
- ModeSpec, for use by ModeSource, ModeMonitor, ModeSolverMonitor. Data includes ModeSolverData, ModeData, ScalarModeFieldDataArray, ModeAmpsDataArray, ModeIndexDataArray, ModeSolver.
## Many Sockets
## Development Tooling
- [ ] Implement `pre-commit`.
- [ ] Pass a `mypy` check
- [ ] Pass all `ruff` checks, including `__doc__` availability.
- [ ] Add profiling support, so we can properly analyze performance characteristics.
- Without a test harness, or profile-while-logging, there may be undue noise in our analysis.
- [ ] Simple `pytest` harnesses for unit testing of nodes, sockets.
- Start with the low-hanging-fruit stuff. Eventually, work towards wider code coverage w/headless Blender.
## Version Churn
- [ ] Migrate to StrEnum sockets (py3.11).
- [ ] Implement drag-and-drop node-from-file via bl4.1 file handler API.
- [ ] Start thinking about ways around `__annotations__` hacking.
- [ ] Prepare for multi-input sockets (bl4.2)
- PR has been merged: <https://projects.blender.org/blender/blender/commit/14106150797a6ce35e006ffde18e78ea7ae67598> (for now, just use the "Combine" node and have separate socket types for both).
- The `Combine` node has its own benefits, including previewability of "only structures". Multi-input would mainly be a kind of shorthand in specific cases (like input to the `Combine` node?)
- [ ] Prepare for volume geonodes (bl4.2; July 16, 2024)
- Will allow for actual volume processing in GeoNodes.
- We might still want/need the jax based stuff after; volume geonodes aren't finalized.
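The StrEnum migration above can be sketched like this (socket names hypothetical), with a fallback so it also runs before py3.11:

```python
import enum

try:
    from enum import StrEnum  # Python >= 3.11
except ImportError:
    class StrEnum(str, enum.Enum):
        """Minimal stand-in for enum.StrEnum on older interpreters."""

class SocketType(StrEnum):
    EXPR = 'Expr'
    DATA = 'Data'

# Members compare equal to plain strings, so no .value unwrapping is needed
# when passing them to Blender's string-typed APIs.
assert SocketType.EXPR == 'Expr'
```

That string-equality property is the main draw: enum members can flow directly into `bpy` identifiers without conversion boilerplate.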
## Packaging
- [ ] Test lockfile platform-agnosticism on Windows
## BLCache
- [ ] Replace every raw property with `BLField`.
- [x] Add matrix property support: https://developer.blender.org/docs/release_notes/3.0/python_api/#other-additions
- [ ] Fix many problems by persisting `_enum_cb_cache` and `_str_cb_cache`.
- [ ] Docstring parser for descriptions.
- [ ] Method of dynamically setting property options after creation, using `idproperty_ui_data`
# BUGS
We're trying to do our part by reporting bugs we find!
This is where we keep track of them for now, if they're not covered by the above listings.
## Blender Maxwell Bugs
See Issues.
## Testing
- [ ] `pytest` integration exhibits a bootstrapping problem when using https://github.com/mondeja/pytest-blender
## Blender Bugs
Reported:
- (SOLVED) <https://projects.blender.org/blender/blender/issues/119664>
Unreported:
- Units are unruly, and are entirely useless when it comes to going small like this.
- The `__mp_main__` bug.
- Animated properties within custom node trees don't update with the frame. See: <https://projects.blender.org/blender/blender/issues/66392>
- Can't update `items` using `id_properties_ui` of `EnumProperty`. Maybe less a bug than an annoyance.
- **Matrix Display Bug**: The data given to matrix properties is entirely ignored in the UI; the data is flattened, then left-to-right, up-to-down, the data is inserted. It's neither row-major nor column-major - it's completely flat.
- Though, if one wanted row-major (**as is consistent with `mathutils.Matrix`**), one would be disappointed - the UI prints the matrix property column-major
- Trying to set the matrix property with a `mathutils.Matrix` is even stranger - firstly, the size of the `mathutils.Matrix` must be transposed with respect to the property size (again the col/row major mismatch). But secondly, even when accounting for the col/row major mismatch, the values of a ex. 2x3 (row-major) matrix (written to with a 3x2 matrix with same flattened sequence) is written in a very strange order:
- Write `mathutils.Matrix` `[[0,1], [2,3], [4,10]]`: Results in (UI displayed row-major) `[[0,3], [4,1], [3,5]]`
- **Workaround (write)**: Simply flatten the 2D array, re-shape by `[cols,rows]`. The UI will display as the original array. `myarray.flatten().reshape([cols,rows])`.
- **Workaround (read)**: `np.array([[el1 for el1 in el0] for el0 in BLENDER_OBJ.matrix_prop]).flatten().reshape([rows,cols])`. Simply flatten the property read 2D array and re-shape by `[rows,cols]`. Mind that data type out is equal to data type in.
- Also, for bool matrices, `toggle=True` has no effect. `alignment='CENTER'` also doesn't align the checkboxes in their cells.
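The two workarounds pair up as an exact round trip; a `numpy`-only sketch (no `bpy` — property assignment/read is simulated, so treat this as illustrative):

```python
import numpy as np

rows, cols = 2, 3
data = np.arange(rows * cols).reshape(rows, cols)  # the intended row-major matrix

# Write workaround: flatten, then reshape by [cols, rows] before assigning
# to the matrix property; the UI then displays the original array.
written = data.flatten().reshape(cols, rows)

# Read workaround: flatten whatever the property returns, reshape by [rows, cols].
read_back = np.array(written).flatten().reshape(rows, cols)
```

Since flatten-then-reshape is order-preserving, the read workaround exactly inverts the write workaround.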
## Tidy3D bugs
Unreported:
- Directly running `SimulationTask.get()` is missing fields - it doesn't return some fields, including `created_at`. Listing tasks by folder is not broken.
- Frequency ranges don't check for repeated elements.
# Designs / Proposals
## Coolness Things
- Let's have operator `poll_message_set`: https://projects.blender.org/blender/blender/commit/ebe04bd3cafaa1f88bd51eee5b3e7bef38ae69bc
- Careful, Python uses user site packages: <https://projects.blender.org/blender/blender/commit/72c012ab4a3d2a7f7f59334f4912402338c82e3c>
- Our modifier obj can see execution time: <https://projects.blender.org/blender/blender/commit/8adebaeb7c3c663ec775fda239fdfe5ddb654b06>
## IDEAS
- [ ] Dependencies-gated addon preferences.
- [ ] Preferences-based specification/alteration of default unit systems for Tidy3D and Blender.
- [ ] Preferences-based specification/alteration of Tidy3D API key, so we can factor away all the `prelock` bullshit.
- [ ] Subsockets
- We need Exprs to not be so picky.
- All the sympy-making nodes should be subsockets of Expr, so that you can plug any socket that should work with Expr into Expr.
- When it comes to Data, any Expr that produces an array-like output from its `LazyValueFunc` should be deemed compatible (as in, the Expr may plug into a Data socket).
- Specifically, that means the presence of a well-defined `Info`, as well as `jax` compatibility.
- [ ] Symbolic Expr Socket
- [ ] Nodes should be able to dynamically define new symbols on their Expr sockets.
- [ ] Expr's `FlowKind`s should be expanded:
- [ ] `Capabilities`: Expand to include subsocket checking, where Expr is the supersocket of ex. most/all of the physical, numerical, vector sockets.
- [ ] `Value`: Just the raw sympy expression, when `active_kind` is `Value`.
- [ ] `Array`: The evaluated `LazyValueFunc`, when `active_kind` is `Array`.
- Should require that the expression as a whole simplifies to `sp.Matrix`.
- Should require that there are no symbols to be defined in a socket (since `LazyValueFunc` must be called with no args).
- [ ] `LazyValueFunc`: Create a 'jax' function from an expression, such that each symbol becomes an argument to that function.
- When `active_kind` is `Value`, it should take arrays/scalars and return a scalar (expression output is a normal sympy number of some kind).
- When `active_kind` is `Array`, it should take arrays/scalars and return an array (expression output is `sp.Matrix`).
- This kind of approach allows using
- [ ] `LazyValueRange`: Expose two expressions, start/end, but with one symbol set.
- [ ] `Info`: Should always produce an `InfoFlow` that, at minimum, has an empty `dim_*`, an `output_shape` of `None`, etc., for a scalar.
- [ ] Implement an Expr Constant node to see all this through in prototype.
- [ ] Expr: Obviously, input and output.
- [ ] Symbols: Node-bound dynamic thing where you can add and subtract symbols, as well as set their type. They should popup in the `Let:` statement of the input expr socket.
- [ ] Examples: Each symbol should have the ability to set "example values", which causes the Node to fill `Params`. When all
- [ ] Report reason for no-link using `self.report`.
- [ ] Dropping a link on empty space should query a menu of possible nodes, or if only one node is reasonable, make that node.
- [ ] Shader visualizations approximated from medium `nk` into a shader node graph, aka. a generic BSDF.
- [ ] Easy conversion of lazyarrayrange to mu/sigma frequency for easy computation of pulse fits from data.
- [ ] IDEA: Hand-craft a faster `spu.convert_to`. <https://github.com/sympy/sympy/blob/a44299273eeb4838beaee9af3b688f2f44d7702f/sympy/physics/units/util.py#L51-L129>
- [ ] We should probably communicate with the `sympy` upstream about our deep usage of unit systems. They might be interested in the various workarounds :)
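The `LazyValueFunc` idea above — every free symbol of an Expr becomes a function argument — can be sketched with `sympy.lambdify` (using the `numpy` backend here for portability; the addon would presumably target `jax`):

```python
import numpy as np
import sympy as sp

x, y = sp.symbols('x y', real=True)
expr = x**2 + sp.sin(y)

# Each symbol becomes a positional argument of the generated function.
fn = sp.lambdify([x, y], expr, 'numpy')  # 'jax' would be the production backend

# Scalars in -> scalar out ('Value' semantics); arrays in -> array out ('Array').
out = fn(np.array([1.0, 2.0]), np.array([0.0, 0.0]))
```

This is exactly the split described above: `Value` corresponds to scalar inputs, `Array` to array inputs, with the same compiled function serving both.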


@ -7,7 +7,7 @@ authors = [
]
dependencies = [
"tidy3d>=2.6.3",
"pydantic==2.6.*",
"pydantic>=2.7.1",
"sympy==1.12",
"scipy==1.12.*",
"trimesh==4.2.*",
@ -21,12 +21,16 @@ dependencies = [
# Pin Blender 4.1.0-Compatible Versions
## The dependency resolver will report if anything is wonky.
"urllib3==1.26.8",
"requests==2.27.1",
#"requests==2.27.1", ## Conflict with dev-dep commitizen
"numpy==1.24.3",
"idna==3.3",
"charset-normalizer==2.0.10",
#"charset-normalizer==2.0.10", ## Conflict with dev-dep commitizen
"certifi==2021.10.8",
]
## When it comes to dev-dep conflicts:
## -> It's okay to leave Blender-pinned deps out of prod; Blender still has them.
## -> In edge cases, other deps might grab newer versions and Blender will complain.
## -> Let's wait and see if this is more than a theoretical issue.
readme = "README.md"
requires-python = "~= 3.11"
license = { text = "AGPL-3.0-or-later" }
@ -38,9 +42,16 @@ license = { text = "AGPL-3.0-or-later" }
managed = true
virtual = true
dev-dependencies = [
"ruff>=0.3.2",
"ruff>=0.4.3",
"fake-bpy-module-4-0>=20231118",
## TODO: Update to Blender 4.1.0
## TODO: Blender 4.1 (when available)
"pre-commit>=3.7.0",
"commitizen>=3.25.0",
## Requires charset-normalizer>=2.1.0
# Required by Commitizen
## -> It's okay to have different dev/prod versions in our use case.
"charset-normalizer==2.1.*",
## Manually scanned CHANGELOG; seems compatible.
]
[tool.rye.scripts]
@ -146,7 +157,19 @@ indent-style = "tab"
docstring-code-format = false
####################
# - Tooling: Pytest
# - Tooling: Commits
####################
#[tool.pytest.ini_options]
[tool.commitizen]
# Specification
name = "cz_conventional_commits"
version_scheme = "semver2"
version_provider = "pep621"
tag_format = "v$version"
# Version Bumping
major_version_zero = true
update_changelog_on_bump = true
# Annotations / Signature
gpg_sign = true
annotated_tag = true
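The gate that `cz check` applies to each message can be approximated with a plain regex (a rough sketch of the Conventional Commits pattern, not commitizen's exact implementation):

```shell
# Rough approximation of the conventional-commit pattern enforced by commitizen.
check_msg() {
  echo "$1" | grep -qE '^(build|chore|ci|docs|feat|fix|perf|refactor|revert|style|test)(\([^)]+\))?!?: .+'
}

check_msg "refactor: applied tooling for predictable lint/fmt/commits" && echo "valid"
```

The real hook also validates body/footer structure, but the type-scope-colon prefix is the part that most often rejects a message.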


@ -9,6 +9,8 @@
annotated-types==0.6.0
# via pydantic
argcomplete==3.3.0
# via commitizen
boto3==1.23.1
# via tidy3d
botocore==1.26.10
@ -16,13 +18,19 @@ botocore==1.26.10
# via s3transfer
certifi==2021.10.8
# via requests
charset-normalizer==2.0.10
cfgv==3.4.0
# via pre-commit
charset-normalizer==2.1.1
# via commitizen
# via requests
click==8.0.3
# via dask
# via tidy3d
cloudpickle==3.0.0
# via dask
colorama==0.4.6
# via commitizen
commitizen==3.25.0
commonmark==0.9.1
# via rich
contourpy==1.2.0
@ -31,7 +39,13 @@ cycler==0.12.1
# via matplotlib
dask==2023.10.1
# via tidy3d
decli==0.6.2
# via commitizen
distlib==0.3.8
# via virtualenv
fake-bpy-module-4-0==20231118
filelock==3.14.0
# via virtualenv
fonttools==4.49.0
# via matplotlib
fsspec==2024.2.0
@ -41,15 +55,20 @@ h5netcdf==1.0.2
h5py==3.10.0
# via h5netcdf
# via tidy3d
identify==2.5.36
# via pre-commit
idna==3.3
# via requests
importlib-metadata==6.11.0
# via commitizen
# via dask
# via tidy3d
jax==0.4.26
jaxlib==0.4.26
# via jax
jaxtyping==0.2.28
jinja2==3.1.3
# via commitizen
jmespath==1.0.1
# via boto3
# via botocore
@ -59,6 +78,8 @@ llvmlite==0.42.0
# via numba
locket==1.0.0
# via partd
markupsafe==2.1.5
# via jinja2
matplotlib==3.8.3
# via tidy3d
ml-dtypes==0.4.0
@ -68,6 +89,8 @@ mpmath==1.3.0
# via sympy
msgspec==0.18.6
networkx==3.2
nodeenv==1.8.0
# via pre-commit
numba==0.59.1
numpy==1.24.3
# via contourpy
@ -87,6 +110,7 @@ numpy==1.24.3
opt-einsum==3.3.0
# via jax
packaging==24.0
# via commitizen
# via dask
# via h5netcdf
# via matplotlib
@ -97,9 +121,14 @@ partd==1.4.1
# via dask
pillow==10.2.0
# via matplotlib
pydantic==2.6.0
platformdirs==4.2.1
# via virtualenv
pre-commit==3.7.0
prompt-toolkit==3.0.36
# via questionary
pydantic==2.7.1
# via tidy3d
pydantic-core==2.16.1
pydantic-core==2.18.2
# via pydantic
pygments==2.17.2
# via rich
@ -116,10 +145,14 @@ python-dateutil==2.9.0.post0
pytz==2024.1
# via pandas
pyyaml==6.0.1
# via commitizen
# via dask
# via pre-commit
# via responses
# via tidy3d
requests==2.27.1
questionary==2.0.1
# via commitizen
requests==2.31.0
# via responses
# via tidy3d
responses==0.23.1
@ -127,23 +160,29 @@ responses==0.23.1
rich==12.5.0
# via tidy3d
rtree==1.2.0
ruff==0.3.2
ruff==0.4.3
s3transfer==0.5.2
# via boto3
scipy==1.12.0
# via jax
# via jaxlib
# via tidy3d
setuptools==69.5.1
# via nodeenv
shapely==2.0.3
# via tidy3d
six==1.16.0
# via python-dateutil
sympy==1.12
termcolor==2.4.0
# via commitizen
tidy3d==2.6.3
toml==0.10.2
# via tidy3d
tomli-w==1.0.0
# via msgspec
tomlkit==0.12.4
# via commitizen
toolz==0.12.1
# via dask
# via partd
@ -161,6 +200,10 @@ urllib3==1.26.8
# via botocore
# via requests
# via responses
virtualenv==20.26.1
# via pre-commit
wcwidth==0.2.13
# via prompt-toolkit
xarray==2024.2.0
# via tidy3d
zipp==3.18.0


@ -96,9 +96,9 @@ partd==1.4.1
# via dask
pillow==10.2.0
# via matplotlib
pydantic==2.6.0
pydantic==2.7.1
# via tidy3d
pydantic-core==2.16.1
pydantic-core==2.18.2
# via pydantic
pygments==2.17.2
# via rich


@ -139,7 +139,7 @@ BLOperatorStatus: typ.TypeAlias = set[
####################
# - Addon Types
####################
KeymapItemDef: typ.TypeAlias = typ.Any ## TODO: Better Type
KeymapItemDef: typ.TypeAlias = typ.Any
ManagedObjName = str
####################


@ -1,4 +1,3 @@
from . import categories, node_tree, nodes, sockets
BL_REGISTER = [


@ -5,11 +5,8 @@ Attributes:
BL_SOCKET_4D_TYPE_PREFIXES: Blender socket prefixes which indicate that the Blender socket has four values.
"""
import typing as typ
import bpy
from blender_maxwell.utils import extra_sympy_units as spux
from blender_maxwell.utils import logger as _logger
from . import contracts as ct


@ -81,11 +81,9 @@ BL_NODE_CATEGORIES = mk_node_categories(
ct.NodeCategory.get_tree()['MAXWELLSIM'],
syllable_prefix=['MAXWELLSIM'],
)
## TODO: refactor, this has a big code smell
BL_REGISTER = [*DYNAMIC_SUBMENU_REGISTRATIONS] ## Must be run after, right now.
## TEST - TODO this is a big code smell
def menu_draw(self, context):
if context.space_data.tree_type == ct.TreeType.MaxwellSim.value:
for nodeitem_or_submenu in BL_NODE_CATEGORIES:


@ -1,25 +1,25 @@
from blender_maxwell.contracts import (
BLClass,
BLColorRGBA,
BLEnumElement,
BLEnumID,
BLIcon,
BLIconSet,
BLIDStruct,
BLKeymapItem,
BLModifierType,
BLNodeTreeInterfaceID,
BLOperatorStatus,
BLPropFlag,
BLRegionType,
BLSpaceType,
KeymapItemDef,
ManagedObjName,
OperatorType,
PanelType,
PresetName,
SocketName,
addon,
BLClass,
BLColorRGBA,
BLEnumElement,
BLEnumID,
BLIcon,
BLIconSet,
BLIDStruct,
BLKeymapItem,
BLModifierType,
BLNodeTreeInterfaceID,
BLOperatorStatus,
BLPropFlag,
BLRegionType,
BLSpaceType,
KeymapItemDef,
ManagedObjName,
OperatorType,
PanelType,
PresetName,
SocketName,
addon,
)
from .bl_socket_types import BLSocketInfo, BLSocketType
@ -27,14 +27,14 @@ from .category_labels import NODE_CAT_LABELS
from .category_types import NodeCategory
from .flow_events import FlowEvent
from .flow_kinds import (
ArrayFlow,
CapabilitiesFlow,
FlowKind,
InfoFlow,
LazyArrayRangeFlow,
LazyValueFuncFlow,
ParamsFlow,
ValueFlow,
ArrayFlow,
CapabilitiesFlow,
FlowKind,
InfoFlow,
LazyArrayRangeFlow,
LazyValueFuncFlow,
ParamsFlow,
ValueFlow,
)
from .flow_signals import FlowSignal
from .icons import Icon

View File

@ -1,7 +1,6 @@
import enum
import typing as typ
_FLOW_SIGNAL_SET: set | None = None

View File

@ -11,10 +11,10 @@ class NodeType(blender_type_enum.BlenderTypeEnum):
ExtractData = enum.auto()
Viz = enum.auto()
## Analysis / Math
OperateMath = enum.auto()
MapMath = enum.auto()
FilterMath = enum.auto()
ReduceMath = enum.auto()
OperateMath = enum.auto()
TransformMath = enum.auto()
# Inputs
@ -22,9 +22,8 @@ class NodeType(blender_type_enum.BlenderTypeEnum):
Scene = enum.auto()
## Inputs / Constants
ExprConstant = enum.auto()
PhysicalConstant = enum.auto()
NumberConstant = enum.auto()
VectorConstant = enum.auto()
PhysicalConstant = enum.auto()
ScientificConstant = enum.auto()
UnitSystemConstant = enum.auto()
BlenderConstant = enum.auto()
@ -104,8 +103,9 @@ class NodeType(blender_type_enum.BlenderTypeEnum):
KSpaceNearFieldProjectionMonitor = enum.auto()
# Sims
FDTDSim = enum.auto()
Combine = enum.auto()
SimDomain = enum.auto()
FDTDSim = enum.auto()
SimGrid = enum.auto()
## Sims / Sim Grid Axis
AutomaticSimGridAxis = enum.auto()
@ -114,5 +114,4 @@ class NodeType(blender_type_enum.BlenderTypeEnum):
ArraySimGridAxis = enum.auto()
# Utilities
Combine = enum.auto()
Separate = enum.auto()

View File

@ -1,4 +1,3 @@
from .base import ManagedObj
# from .managed_bl_empty import ManagedBLEmpty

View File

@ -1,4 +1,3 @@
import bpy
from blender_maxwell.utils import logger

View File

@ -1,6 +1,6 @@
"""Declares `ManagedBLImage`."""
#import time
# import time
import typing as typ
import bpy

View File

@ -6,7 +6,6 @@ import jax.numpy as jnp
import sympy as sp
from blender_maxwell.utils import bl_cache, logger
from blender_maxwell.utils import extra_sympy_units as spux
from .... import contracts as ct
from .... import sockets

View File

@ -5,11 +5,8 @@ import typing as typ
import bpy
import jax
import jax.numpy as jnp
import sympy as sp
from blender_maxwell.utils import bl_cache, logger
from blender_maxwell.utils import extra_sympy_units as spux
from .... import contracts as ct
from .... import sockets

View File

@ -144,6 +144,4 @@ class AdiabAbsorbBoundCondNode(base.MaxwellSimNode):
BL_REGISTER = [
AdiabAbsorbBoundCondNode,
]
BL_NODES = {
ct.NodeType.AdiabAbsorbBoundCond: (ct.NodeCategory.MAXWELLSIM_BOUNDS)
}
BL_NODES = {ct.NodeType.AdiabAbsorbBoundCond: (ct.NodeCategory.MAXWELLSIM_BOUNDS)}

View File

@ -1,23 +1,23 @@
from . import (
constants,
file_importers,
#unit_system,
wave_constant,
web_importers,
constants,
file_importers,
# unit_system,
wave_constant,
web_importers,
)
# from . import file_importers
BL_REGISTER = [
*wave_constant.BL_REGISTER,
#*unit_system.BL_REGISTER,
# *unit_system.BL_REGISTER,
*constants.BL_REGISTER,
*web_importers.BL_REGISTER,
*file_importers.BL_REGISTER,
]
BL_NODES = {
**wave_constant.BL_NODES,
#**unit_system.BL_NODES,
# **unit_system.BL_NODES,
**constants.BL_NODES,
**web_importers.BL_NODES,
**file_importers.BL_NODES,

View File

@ -1,9 +1,9 @@
from . import (
blender_constant,
expr_constant,
number_constant,
physical_constant,
scientific_constant,
blender_constant,
expr_constant,
number_constant,
physical_constant,
scientific_constant,
)
BL_REGISTER = [

View File

@ -1,4 +1,3 @@
import enum
import typing as typ
import bpy

View File

@ -1,8 +1,8 @@
#from . import tidy_3d_web_importer
# from . import tidy_3d_web_importer
BL_REGISTER = [
#*tidy_3d_web_importer.BL_REGISTER,
# *tidy_3d_web_importer.BL_REGISTER,
]
BL_NODES = {
#**tidy_3d_web_importer.BL_NODES,
# **tidy_3d_web_importer.BL_NODES,
}

View File

@ -1,154 +0,0 @@
import typing as typ
from pathlib import Path
import bpy
import tidy3d as td
from blender_maxwell.services import tdcloud
from blender_maxwell.utils import bl_cache, logger
from .... import contracts as ct
from .... import sockets
from ... import base, events
log = logger.get(__name__)
class LoadCloudSim(bpy.types.Operator):
bl_idname = ct.OperatorType.NodeLoadCloudSim
bl_label = '(Re)Load Sim'
bl_description = '(Re)Load simulation data associated with the attached cloud task'
@classmethod
def poll(cls, context):
return (
# Node Type
hasattr(context, 'node')
and hasattr(context.node, 'node_type')
and context.node.node_type == ct.NodeType.Tidy3DWebImporter
# Cloud Status
and tdcloud.IS_ONLINE
and tdcloud.IS_AUTHENTICATED
)
def execute(self, context):
node = context.node
# Try Loading Simulation Data
# node.sim_data = bl_cache.Signal.InvalidateCache
sim_data = node.sim_data
if sim_data is None:
self.report(
{'ERROR'},
'Sim Data could not be loaded. Check your network connection.',
)
else:
self.report({'INFO'}, 'Sim Data loaded.')
return {'FINISHED'}
####################
# - Node
####################
class Tidy3DWebImporterNode(base.MaxwellSimNode):
node_type = ct.NodeType.Tidy3DWebImporter
bl_label = 'Tidy3D Web Importer'
input_sockets: typ.ClassVar = {
'Cloud Task': sockets.Tidy3DCloudTaskSocketDef(
should_exist=True,
),
}
output_sockets: typ.ClassVar = {
'Sim Data': sockets.MaxwellFDTDSimDataSocketDef(),
}
####################
# - Properties
####################
sim_data_loaded: bool = bl_cache.BLField(False)
####################
# - Computed
####################
@property
def sim_data(self) -> td.SimulationData | None:
cloud_task = self._compute_input(
'Cloud Task', kind=ct.FlowKind.Value, optional=True
)
has_cloud_task = not ct.FlowSignal.check(cloud_task)
if (
has_cloud_task
and cloud_task is not None
and isinstance(cloud_task, tdcloud.CloudTask)
and cloud_task.status == 'success'
):
sim_data = tdcloud.TidyCloudTasks.download_task_sim_data(
cloud_task, _sim_data_cache_path(cloud_task.task_id)
)
self.sim_data_loaded = True
return sim_data
return None
####################
# - UI
####################
def draw_operators(self, _: bpy.types.Context, layout: bpy.types.UILayout):
if self.sim_data_loaded:
layout.operator(ct.OperatorType.NodeLoadCloudSim, text='Reload Sim')
else:
layout.operator(ct.OperatorType.NodeLoadCloudSim, text='Load Sim')
####################
# - Events
####################
@events.on_value_changed(
prop_name='sim_data_loaded', run_on_init=True, props={'sim_data_loaded'}
)
def on_cloud_task_changed(self, props: dict):
if props['sim_data_loaded']:
if not self.loose_output_sockets:
self.loose_output_sockets = {
'Sim Data': sockets.MaxwellFDTDSimDataSocketDef(),
}
elif self.loose_output_sockets:
self.loose_output_sockets = {}
####################
# - Output
####################
@events.computes_output_socket(
'Sim Data',
props={'sim_data_loaded'},
input_sockets={'Cloud Task'},
)
def compute_sim_data(self, props: dict, input_sockets: dict) -> str:
if props['sim_data_loaded']:
cloud_task = input_sockets['Cloud Task']
if (
# Check Flow
not ct.FlowSignal.check(cloud_task)
# Check Task
and cloud_task is not None
and isinstance(cloud_task, tdcloud.CloudTask)
and cloud_task.status == 'success'
):
return self.sim_data
return ct.FlowSignal.FlowPending
return ct.FlowSignal.FlowPending
####################
# - Blender Registration
####################
BL_REGISTER = [
LoadCloudSim,
Tidy3DWebImporterNode,
]
BL_NODES = {
ct.NodeType.Tidy3DWebImporter: (ct.NodeCategory.MAXWELLSIM_INPUTS_WEBIMPORTERS)
}

View File

@ -40,8 +40,6 @@ class EHFieldMonitorNode(base.MaxwellSimNode):
mathtype=spux.MathType.Integer,
default_value=sp.Matrix([10, 10, 10]),
),
## TODO: Pass a grid instead of size and resolution
## TODO: 1D (line), 2D (plane), 3D modes
}
input_socket_sets: typ.ClassVar = {
'Freq Domain': {

View File

@ -1,6 +1,6 @@
# from . import sim_grid
# from . import sim_grid_axes
from . import fdtd_sim, sim_domain, combine
from . import combine, fdtd_sim, sim_domain
BL_REGISTER = [
*combine.BL_REGISTER,

View File

@ -4,8 +4,6 @@ from . import (
# plane_wave_source,
point_dipole_source,
temporal_shapes,
# tfsf_source,
# uniform_current_source,
)
BL_REGISTER = [

View File

@ -4,7 +4,6 @@ import sympy as sp
import sympy.physics.units as spu
import tidy3d as td
from blender_maxwell.assets.geonodes import GeoNodes, import_geonodes
from blender_maxwell.utils import extra_sympy_units as spux
from blender_maxwell.utils import logger

View File

@ -831,8 +831,9 @@ class MaxwellSimSocket(bpy.types.NodeSocket):
col = row.column(align=True)
{
ct.FlowKind.Value: self.draw_value,
ct.FlowKind.LazyArrayRange: self.draw_lazy_array_range,
ct.FlowKind.Array: self.draw_array,
ct.FlowKind.LazyArrayRange: self.draw_lazy_array_range,
ct.FlowKind.LazyValueFunc: self.draw_lazy_value_func,
}[self.active_kind](col)
# Info Drawing

View File

@ -57,6 +57,3 @@ class StringSocketDef(base.SocketDef):
BL_REGISTER = [
StringBLSocket,
]

View File

@ -13,7 +13,6 @@ from . import base
## TODO: This is a big node, and there's a lot to get right.
## - Dynamically adjust socket color in response to, especially, the unit dimension.
## - Iron out the meaning of display shapes.
## - Generally pay attention to validity checking; it's make or break.
## - For array generation, it may pay to have both a symbolic expression (producing output according to `shape` as usual) denoting how to actually make values, and how many. Enables ex. easy symbolic plots.

View File

@ -1,5 +1,3 @@
import typing as typ
import bpy
import tidy3d as td

View File

@ -1,4 +1,3 @@
from ... import contracts as ct
from .. import base

View File

@ -1,4 +1,3 @@
from ... import contracts as ct
from .. import base

View File

@ -1,4 +1,3 @@
from ... import contracts as ct
from .. import base

View File

@ -19,7 +19,9 @@ class MaxwellSimGridBLSocket(base.MaxwellSimSocket):
min=0.01,
# step=10,
precision=2,
update=(lambda self, context: self.on_prop_changed('min_steps_per_wl', context)),
update=(
lambda self, context: self.on_prop_changed('min_steps_per_wl', context)
),
)
####################

View File

@ -1,4 +1,3 @@
from ... import contracts as ct
from .. import base

View File

@ -15,8 +15,6 @@ class MaxwellSourceSocketDef(base.SocketDef):
is_list: bool = False
## TODO: capabilities() to require source sockets
def init(self, bl_socket: MaxwellSourceBLSocket) -> None:
if self.is_list:
bl_socket.active_kind = ct.FlowKind.Array

View File

@ -1,4 +1,3 @@
from ... import contracts as ct
from .. import base

View File

@ -1,7 +1,6 @@
import bpy
#from blender_maxwell.utils.pydantic_sympy import SympyExpr
# from blender_maxwell.utils.pydantic_sympy import SympyExpr
from ... import contracts as ct
from .. import base
@ -137,14 +136,18 @@ class PhysicalUnitSystemBLSocket(base.MaxwellSimSocket):
description='Unit of acceleration',
items=contract_units_to_items(ST.PhysicalAccelScalar),
default=default_unit_key_for(ST.PhysicalAccelScalar),
update=(lambda self, context: self.on_prop_changed('unit_accel_scalar', context)),
update=(
lambda self, context: self.on_prop_changed('unit_accel_scalar', context)
),
)
unit_force_scalar: bpy.props.EnumProperty(
name='Force Scalar Unit',
description='Unit of scalar force',
items=contract_units_to_items(ST.PhysicalForceScalar),
default=default_unit_key_for(ST.PhysicalForceScalar),
update=(lambda self, context: self.on_prop_changed('unit_force_scalar', context)),
update=(
lambda self, context: self.on_prop_changed('unit_force_scalar', context)
),
)
unit_accel_3d: bpy.props.EnumProperty(
name='Accel3D Unit',

View File

@ -1,4 +1,4 @@
from . import install_deps, uninstall_deps, manage_pydeps
from . import install_deps, manage_pydeps, uninstall_deps
BL_REGISTER = [
*install_deps.BL_REGISTER,

View File

@ -1,5 +1,3 @@
import subprocess
import sys
from pathlib import Path
import bpy

View File

@ -2,7 +2,6 @@ import os
import re
import subprocess
import sys
import time
from pathlib import Path
from . import pydeps, simple_logger
@ -73,7 +72,7 @@ def returncode() -> bool:
def kill() -> None:
global PROCESS # noqa: PLW0603
global PROCESS
if not is_running():
msg = "Can't kill process that isn't running"

View File

@ -1,7 +1,6 @@
"""Tools for fearless managemenet of addon-specific Python dependencies."""
import contextlib
import functools
import importlib.metadata
import os
import sys

View File

@ -323,7 +323,7 @@ class CachedBLProperty:
"Can't Get CachedBLProperty: Instance ID not (yet) defined on BLInstance %s",
str(bl_instance),
)
return
return None
# Create Non-Persistent Cache Entry
## Prefer explicit cache management to 'defaultdict'

View File

@ -126,7 +126,7 @@ def mpl_fig_canvas_ax(width_inches: float, height_inches: float, dpi: int):
ax = fig.add_subplot()
# The Customer is Always Right (in Matters of Taste)
#fig.tight_layout(pad=0)
# fig.tight_layout(pad=0)
return (fig, canvas, ax)
@ -250,7 +250,7 @@ def plot_heatmap_2d(
y_unit = info.dim_units[y_name]
heatmap = ax.imshow(data, aspect='auto', interpolation='none')
#ax.figure.colorbar(heatmap, ax=ax)
# ax.figure.colorbar(heatmap, ax=ax)
ax.set_title('Heatmap')
ax.set_xlabel(f'{x_name}' + (f'({x_unit})' if x_unit is not None else ''))
ax.set_ylabel(f'{y_name}' + (f'({y_unit})' if y_unit is not None else ''))

View File

@ -0,0 +1,7 @@
import numpy as np
print('Imported Addon 1 w/np.__file__:', np.__file__)
def np_file_addon_1():
return np.__file__

View File

@ -0,0 +1,7 @@
import numpy as np
print('Imported Addon 2 w/np.__file__:', np.__file__)
def np_file_addon_2():
return np.__file__

742
src/ienv/ienv.py 100644
View File

@ -0,0 +1,742 @@
"""Interpreter-integrated ENVironments - like 'venv', but in one same Python process!
Ever wanted to **robustly** use two subpackages, with their own dependencies, in the same process?
Now you can, by letting the package be an 'ienv'!
The cost is your soul, of course.
Well, a lightly customized `builtins.__import__`, but isn't that the same kind of deal?
# Example
Let's presume you've setup your project structure something like this:
```
main.py <-- Run this (`ienv` must be importable).
children/
.. child1/ <-- This is an IEnv
.. child1/.ienv-deps
.. child1/__init__.py
.. child2/ <-- This is also an IEnv
.. child2/.ienv-deps
.. child2/__init__.py
```
Say you want to run the following `main.py`, which prints out the `__file__` attribute of `numpy` imported in each:
```
from children import child1
from children import child2
print('Child 1 Function: np.__file__:', child1.np_file_addon_1())
print('Child 2 Function: np.__file__:', child2.np_file_addon_2())
```
However, your boss says that:
- `child1` **must** use `numpy==1.24.4`
- `child2` **must** use `numpy==1.26.1`
Generally, this would be impossible.
But that's where IEnv comes in.
### Installing `ienv.py`
It's the usual story: As long as `main.py` can `import ienv`, you're set.
Some ideas:
- A `venv`: This is the recommended setup.
- The same folder as `main.py`: If you run `python ./main.py`, then you're set.
- Any `sys.path` Folder: The general case.
`ienv.py` has no dependencies, so it should be perfectly portable to all kinds of weird setups.
### Installing Dependencies to the IEnvs
Let's quickly install numpy on each.
- `python -m pip install --target child1/.ienv-deps numpy==1.24.4`.
- `python -m pip install --target child2/.ienv-deps numpy==1.26.1`.
**Make sure to use the same `python` as you'll be running `main.py` with.**
_You could also do this from within `main.py`, with the help of `subprocess.run`._
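The `subprocess.run` route hinted at above can be sketched as follows. This is an assumption-laden illustration (the helper name `ienv_pip_cmd` is hypothetical, not part of `ienv.py`); the key point is to invoke pip via `sys.executable` so the install targets the same interpreter that will run `main.py`:

```python
import sys
from pathlib import Path


def ienv_pip_cmd(ienv_dir: Path, requirement: str) -> list[str]:
	# Hypothetical helper: build the pip invocation that installs a pinned
	# dependency into an IEnv's '.ienv-deps' folder. Pass the result to
	# subprocess.run(..., check=True) to actually perform the install.
	return [
		sys.executable,
		'-m',
		'pip',
		'install',
		'--target',
		str(ienv_dir / '.ienv-deps'),
		requirement,
	]


print(ienv_pip_cmd(Path('children/child1'), 'numpy==1.24.4'))
```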
### Run `main.py`
To run the main.py, we just need to add a little snippet above everything else:
```
import ienv
from pathlib import Path
ienv_base_path = Path(__file__).resolve().parent / 'children'
ienv.init(ienv_base_path)
...
```
Now, when you run `main.py`, you should see a very pro
# IEnv Semantics
**An "IEnv" is a Python package with its own dependencies.**
What's special is that **IEnvs can share a process without sharing dependencies**.
This all happens without the code in the IEnv having to do anything special.
## Classification
To be classified as an IEnv, a Python module:
- **MUST** be a valid Python package, with an `__init__.py`.
- **CANNOT** be the entrypoint of the program.
- **MUST** be imported from a context where `ienv.init()` has been run.
- **MUST** be a subfolder of the `ienv_base_path` passed as an argument to the latest run of `ienv.init()`.
- **MUST** have a subfolder named `.ienv-deps`, which only contains Python modules (incl. packages).
## General Behavior
From any module in IEnv (or the IEnv itself), `import` will now work slightly differently:
- `import` will prioritize searching `.ienv-deps` (and can be configured to reject other sources).
- If a module is found in `.ienv-deps`, the `sys.modules` module name will have an IEnv-specific prefix.
- `import` will always check `sys.modules` using the IEnv-prefixed name.
It's just as important what `ienv` **DOES NOT** do:
- All `stdlib` imports are passed through to the builtin `__import__`.
- The user may also specify modules to always pass through.
- The performance properties of `sys.modules` are completely preserved, even within IEnvs.
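The namespacing behavior described above can be illustrated standalone, assuming only the `_ienv_<ienv_name>__` prefix convention (the module name `examplepkg` is made up for the demonstration):

```python
import sys
import types

# A dependency imported from inside IEnv 'child1' is registered under an
# IEnv-specific sys.modules key, never under its plain name; a second IEnv
# can therefore hold a different version under its own key.
fake_dep = types.ModuleType('examplepkg')
sys.modules['_ienv_child1__examplepkg'] = fake_dep

print('_ienv_child1__examplepkg' in sys.modules)  # True
print('examplepkg' in sys.modules)  # False: no un-namespaced entry is created
```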
# Gotchas
There are some **gotchas** you must make peace with if you want to use IEnvs.
## No Dynamic imports
**Dynamic imports are not guaranteed available after `ienv.free()` has run.**
Don't use them!
Note that:
- They're generally a bad idea anyway, as **the import semantics of dynamic contexts cannot be statically known**.
- If your program never runs `ienv.free()`, then dynamic imports will work just fine.
## Not Portable
**`pip install`ed packages can never be presumed portable across operating systems**.
As a result, IEnvs are not generally copy-pasteable to other folders or operating systems.
Note that:
- If you're certain that no dependencies will break by being moved from their install dir, then the IEnv can be moved.
- If, also, all dependencies are cross-platform, then the IEnv can be copied to other platforms.
Python modules, being very dynamic, may have undefined behavior in response to being moved.
"""
import builtins
import dataclasses
import enum
import functools
import importlib
import importlib.abc
import importlib.machinery
import importlib.util ## This should already make you concerned :)
import os
import re
import sys
import types
import typing as typ
from pathlib import Path
builtins__import = __import__
####################
# - Types
####################
ValidDirName: typ.TypeAlias = str
PathLikeStr: typ.TypeAlias = str
ModuleNamePrefix: typ.TypeAlias = str
ModuleName: typ.TypeAlias = str
AbsoluteModuleName: typ.TypeAlias = str
IEnvName: typ.TypeAlias = str
####################
# - IEnv Constants
####################
_USE_CPYTHON_MODULE_SUFFIX_PRECEDENCE: bool = False
_IENV_PREFIX: ModuleNamePrefix = '_ienv_'
_IENV_DEPS_DIRNAME: ValidDirName = '.ienv-deps'
IENV_BASE_PATH: Path | None = None
ALWAYS_PASSTHROUGH: set[ModuleName] | None = None
####################
# - IEnv Analysis Functions
####################
@functools.cache
def is_in_ienv(caller_path_str: PathLikeStr) -> bool:
return IENV_BASE_PATH in Path(caller_path_str).parents
@functools.cache
def compute_ienv_name(caller_path: Path) -> IEnvName:
if not is_in_ienv(os.fspath(caller_path)): ## Reuse @cache by stringifying Path
msg = f'Attempted to import an IEnv, but caller ({caller_path}) is not in the IENV_BASE_PATH ({IENV_BASE_PATH})'
raise ImportError(msg)
return caller_path.relative_to(IENV_BASE_PATH).parts[0]
@functools.cache
def compute_ienv_path(ienv_name: IEnvName) -> Path:
return IENV_BASE_PATH / ienv_name
@functools.cache
def compute_ienv_deps_path(ienv_name: IEnvName) -> Path:
return IENV_BASE_PATH / ienv_name / _IENV_DEPS_DIRNAME
@functools.cache
def compute_ienv_module_prefix(ienv_name: IEnvName) -> ModuleNamePrefix:
return _IENV_PREFIX + f'{ienv_name}__'
@functools.cache
def match_ienv_module_name(ienv_module_name: AbsoluteModuleName) -> re.Match | None:
return re.match(r'^_ienv_(?P<ienv_name>[a-z0-9\_-]+)__', ienv_module_name)
####################
# - IEnv __import__
####################
def import_ienv(
name: str,
_globals: dict[str, typ.Any] | None = None,
_locals: dict[str, typ.Any] | None = None,
fromlist: tuple[str, ...] = (),
level: int = 0,
) -> types.ModuleType:
"""Imports an `ienv`, using the same context provided to `__import__`.
# Semantics
This function is designed to be called from a replaced `builtins.__import__`.
Thus, its semantics are identical to `__import__`, but differs in exactly two subtle ways.
**Namespaced `sys.modules` Lookup**
- Usually, `import name` will lookup 'name' in `sys.modules`.
- Now, `import name` will lookup '_ienv_<ienv_name>__<name>' in `sys.modules`.
**Namespaced `sys.modules` Assignment**
- Usually, `import name` -> `sys.modules['name']`.
- Now, `import name` -> `sys.modules['_ienv_<ienv_name>__<name>']`
## Relationship to `sys.meta_path` Finder
Strictly speaking, the second one (**Assignment**) is performed by a complementary `sys.meta_path` finder.
However, this finder only triggers when `builtins.__import__` is called with a specially-prefixed name.
This function automates the preparation of this specially-prefixed name.
Arguments:
name: The name of the module to import.
_globals: The `globals()` dictionary from where `import` was called.
This is used to decide which module to import and return.
_locals: The `locals()` dictionary from where `import` was called.
As with `builtins.__import__`, it must be defined, but it is not used.
It is included here (and passed on) to match these semantics.
fromlist: Names to guarantee available in the returned module.
For each `attr in fromlist`, it must be possible to call `mod.attr` on the returned module `mod`.
level: The amount of module nesting.
Always `>= 0`.
`level=0` denotes an absolute import, ex. `import name`.
`level>0` denotes a relative import, ex. `from ... import name`.
For more details, see the source code.
Returns:
An imported module, referring to the same object as an IEnv-namespaced `sys.modules` entry.
Raises:
ImportError: Cannot be called from any module not within an IEnv path.
"""
# Scan Caller for Context
## _globals contains all information for how to import.
caller_package: str | None = _globals.get('__package__')
# Compute IEnv Name
## From Caller __file__
if '__file__' in _globals:
ienv_name = compute_ienv_name(Path(_globals['__file__']))
## From Caller __name__
### This makes dynamic imports from IEnv modules also IEnv-namespaced.
elif (
'__name__' in _globals
and _globals['__name__'].startswith(_IENV_PREFIX)
and (_match := match_ienv_module_name(_globals['__name__'].split('.')[0]))
):
ienv_name = _match['ienv_name']
## Caller Invalid
else:
msg = 'An IEnv import was attempted where neither __file__ nor __name__ are present in the caller globals()'
raise RuntimeError(msg)
# Compute IEnv Module Prefix
ienv_module_prefix = compute_ienv_module_prefix(ienv_name)
# Compute Absolute Module Name
## '.' is folder separator.
## Top-level module is in a sys.path-searchable folder.
importing_submodule = False
if level == 0:
# Absolute Name is Top-Level Module
## -> 'import module.var1' (only imports module)
if '.' in name and len(fromlist) == 0:
abs_import_name = name.split('.')[0]
# INVALID: Top-Level Relative Import
## -> 'import .' (invalid syntax)
elif name == '':
msg = f'Caller attempted a top-level relative import (caller package={caller_package})'
raise ImportError(msg)
# Absolute Name is Name (any of the following)
## len(fromlist) == 0 -> 'import module'
## len(fromlist) > 0 -> 'from module import var1, ...'
## len(fromlist) > 0 -> 'from module1.module2 import var1, ...'
else:
abs_import_name = name
elif level > 0:
if caller_package is None:
msg = 'Caller attempted a relative import, but has no __package__'
raise ImportError(msg)
# Absolute Name is Current Package
## -> 'from . import var1, ...'
if name == '' and len(fromlist) > 0:
abs_import_name = caller_package
# INVALID:
## -> 'from .' (invalid syntax)
elif name == '' and len(fromlist) == 0:
msg = f'Caller attempted to import nothing from current package ({caller_package})'
raise ImportError(msg)
# Absolute Name is Package and Module
## -> 'from ...spam.ham import var1, ...'
elif '.' in name and len(fromlist) > 0:
abs_import_name = '.'.join([caller_package, name])
# Absolute Name is Module
## -> 'from spam import var1, ...'
elif len(fromlist) > 0:
abs_import_name = name
importing_submodule = True
# INVALID: Top-Level Module is Relative
## -> 'import .module.var1'
elif '.' in name and len(fromlist) == 0:
msg = f'Caller attempted to import its own package ({caller_package})'
raise ImportError(msg)
# Compute (Absolute) Module Name w/wo IEnv-Specific Prefix
## Imported with Non-IEnv-Prefixed Name
if not abs_import_name.startswith(ienv_module_prefix) and not importing_submodule:
# module_name = abs_import_name
ienv_module_name = ienv_module_prefix + abs_import_name
## Imported with IEnv-Prefixed Name
else:
# module_name = abs_import_name.removeprefix(ienv_module_prefix)
ienv_module_name = abs_import_name
# Lookup IEnv-Prefixed (Absolute) Module Name in sys.modules
## This preserves the caching behavior of __import__.
## This snippet is the ONLY reason to override __import__.
if (_module := sys.modules.get(ienv_module_name)) is not None:
return _module
# Import IEnv-Prefixed (Absolute) Module Name
## The builtin __import__ statement will use 'sys.meta_path' to import the module.
## We've injected a custom "Finder" into 'sys.meta_path'.
## Our custom "Finder" will ensure that 'sys.modules' is filled with 'ienv_module_name'.
return builtins__import(
ienv_module_name,
globals=_globals,
locals=_locals,
fromlist=fromlist,
level=level,
)
####################
# - __import__ Replacement
####################
def _import(
name,
globals=None, # noqa: A002
locals=None, # noqa: A002
fromlist=(),
level=0,
) -> types.ModuleType:
if (
## Never Hijack stdlib Imports
name not in sys.stdlib_module_names
## Never Hijack "Special" Imports
and name not in ALWAYS_PASSTHROUGH
## Only Hijack if Caller has Globals
and globals is not None
## Hijack if Caller in IEnv (determined by __file__ or __name__)
and (
# Detect that Caller is in IEnv by __file__
'__file__' in globals
and is_in_ienv(globals['__file__'])
# Detect that Caller is in IEnv by __package__ == __name__
## __init__.py may not have __file__; this is how we detect that.
# or (
# '__file__' not in globals
# and '__package__' in globals
# and '__name__' in globals
# and globals['__name__'] == globals['__package__']
# and globals['__name__'].startswith(_IENV_PREFIX)
# )
or (
'__file__' not in globals
and '__path__' in globals
and len(globals['__path__']) > 0
and is_in_ienv(globals['__path__'][0])
)
)
):
return import_ienv(
name, _globals=globals, _locals=locals, fromlist=fromlist, level=level
)
return builtins__import(
name, globals=globals, locals=locals, fromlist=fromlist, level=level
)
####################
# - IEnv Module Info
####################
class ModuleType(enum.StrEnum):
Source = enum.auto() ## File w/Python Code (.py)
Bytecode = enum.auto() ## File w/Python Bytecode (.pyc)
Extension = enum.auto() ## Compiled Extension Module (.so/.dll)
Package = enum.auto() ## Folder w/__init__.py
Namespace = enum.auto() ## Folder w/o __init__.py
Builtin = enum.auto() ## stdlib Modules (compiled into the Python interpreter)
Frozen = enum.auto() ## Compiled into Python interpreter
# ModuleType to Loader Mapping
## Almost identical call signatures:
## - SourceFileLoader: (fullname, path)
## - SourcelessFileLoader: (fullname, path)
## - ExtensionFileLoader: (fullname, path)
## - BuiltinImporter: ()
## - Frozen: ()
_MODULE_LOADERS: dict[ModuleType, importlib.abc.Loader] = {
ModuleType.Source: importlib.machinery.SourceFileLoader,
ModuleType.Bytecode: importlib.machinery.SourcelessFileLoader,
ModuleType.Extension: importlib.machinery.ExtensionFileLoader,
ModuleType.Package: importlib.machinery.SourceFileLoader, ## Load __init__.py
ModuleType.Namespace: None,
ModuleType.Builtin: importlib.machinery.BuiltinImporter,
ModuleType.Frozen: importlib.machinery.FrozenImporter,
}
@dataclasses.dataclass(frozen=True, kw_only=True)
class IEnvModuleInfo:
"""Information about a module that can be depended on by an IEnv.
Based on the IEnv-specific name of the module, information about the IEnv that depends on this module can be computed.
Such information is available as computed properties.
This module is always associated with a subpath of `ienv_deps_path`.
In particular, the module always has a ModuleType of one of:
- ModuleType.Source
- ModuleType.Bytecode
- ModuleType.Extension
- ModuleType.Package
- ModuleType.Namespace
"""
ienv_module_name: AbsoluteModuleName
####################
# - IEnv Info wrt. Module
####################
@functools.cached_property
def ienv_name(self) -> IEnvName:
if match := match_ienv_module_name(self.ienv_module_name):
return match['ienv_name']
msg = f'Parsing IEnv Name from Module "{self.ienv_module_name}" failed; is the module prefixed with "{_IENV_PREFIX}"?'
raise RuntimeError(msg)
@property
def ienv_prefix(self) -> ModuleNamePrefix:
return compute_ienv_module_prefix(self.ienv_name)
@property
def ienv_deps_path(self) -> Path:
return compute_ienv_deps_path(self.ienv_name)
####################
# - Module Info
####################
@functools.cached_property
def module_name(self) -> AbsoluteModuleName:
return self.ienv_module_name.removeprefix(self.ienv_prefix)
@functools.cached_property
def module_path(self) -> Path:
"""Computes the path to this module, guaranteeing that it is either a directory or a file.
When the module is a file, all supported module suffixes are tested.
If no files with a supported module suffix match an existing file, then an `ImportError` is thrown.
If more than one file exists at the path with a supported module suffix, we're left with a question of "module suffix precedence".
There are two philosophies about how to deal with this:
- SystemError (default): Since the Python language doesn't specify which file to load, the choice is ambiguous, and the program cannot continue. We should therefore throw an explicit SystemError to encourage users to complain about the lack of specification. **May break some libraries** (but maybe they shouldn't work to begin with)
- CPython Precedence: Since Python has a de-facto implementation, we should fallback to its behavior. In `importlib/_bootstrap_external.py` we can clearly see, in `_get_supported_file_loaders()`, that the precedence goes (highest to lowest): **Extensions, source, bytecode**.
Use the module-level `_USE_CPYTHON_MODULE_SUFFIX_PRECEDENCE` global variable to select which behavior you prefer.
**Note that "CPython Precedence" will NOT try to match CPython's precedence within each category of suffix.**
Returns:
The path to the module itself
Raises:
ImportError: The computed path isn't a directory, and NO file exists at the path with a supported module suffix.
SystemError: The computed path isn't a directory, >1 file could potentially be imported, and CPython module suffix precedence is not in use.
"""
# Load the Module Path w/o Extension
module_path_noext = self.ienv_deps_path / Path(*self.module_name.split('.'))
# Is Directory: Directories Don't Have FILE Extensions!
if module_path_noext.is_dir():
return module_path_noext
module_path_candidates = [
module_path_candidate
for module_suffix in importlib.machinery.all_suffixes()
if (
module_path_candidate := module_path_noext.with_suffix(module_suffix)
).is_file()
]
if len(module_path_candidates) == 1:
return module_path_candidates[0]
if len(module_path_candidates) == 0:
msg = f'Computed module base path {module_path_noext} for {self.ienv_module_name} does not have a file with a valid module extension'
raise ImportError(msg)
# >1 Module Path Candidates
## We can choose to approximate CPython's module suffix precedence.
## Or, we throw an error, since module choice is ambiguous.
if _USE_CPYTHON_MODULE_SUFFIX_PRECEDENCE:
module_path_candidates.sort(
key=lambda el: (
3 * int(el.suffix in importlib.machinery.EXTENSION_SUFFIXES)
+ 2 * int(el.suffix in importlib.machinery.SOURCE_SUFFIXES)
+ 1 * int(el.suffix in importlib.machinery.BYTECODE_SUFFIXES)
),
reverse=True, ## Highest-precedence suffix must sort FIRST, since [0] is returned.
)
return module_path_candidates[0]
msg = f'Computed module base path {module_path_noext} for {self.ienv_module_name} does not resolve to one unambiguous file from which to load a module; found {len(module_path_candidates)} candidates'
raise SystemError(msg)
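As a standalone sketch of the "CPython Precedence" fallback above (file names and the `suffix_precedence` helper are invented for illustration), the sort key ranks candidate suffixes so that extension modules beat source files, which beat bytecode:

```python
import importlib.machinery
from pathlib import Path

def suffix_precedence(path: Path) -> int:
    # Rank: extension modules (3) > source (2) > bytecode (1),
    # mirroring the precedence documented in module_path.
    return (
        3 * int(path.suffix in importlib.machinery.EXTENSION_SUFFIXES)
        + 2 * int(path.suffix in importlib.machinery.SOURCE_SUFFIXES)
        + 1 * int(path.suffix in importlib.machinery.BYTECODE_SUFFIXES)
    )

# Two hypothetical candidates sharing one stem.
candidates = [Path('mod.pyc'), Path('mod.py')]
candidates.sort(key=suffix_precedence, reverse=True)
print(candidates[0])  # mod.py (source outranks bytecode)
```

Note that sorting ascending (without `reverse=True`) would place the *lowest*-precedence candidate first.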
@functools.cached_property
def module_type(self) -> ModuleType:
"""Computes the type of this module.
Raises:
ValueError: If the suffix of the module path doesn't indicate a valid Python module.
RuntimeError: If the module path couldn't matched to a module type, or the module path is no longer a directory or file.
`self.module_path` should guarantee that the module path is either a directory or a file.
"""
# Module Path is Directory: Package or Namespace
if self.module_path.is_dir():
# Module is Package
if (self.module_path / '__init__.py').is_file():
return ModuleType.Package
# Module is Namespace
return ModuleType.Namespace
if self.module_path.is_file():
module_file_extension = ''.join(self.module_path.suffixes)
if module_file_extension not in importlib.machinery.all_suffixes():
msg = f"The file {self.module_path} has a suffix {module_file_extension} which the current Python process doesn't recognize as a valid Python module extension. Is the file extension compatible with the current OS?"
raise ValueError(msg)
# Module is Source File
if module_file_extension in importlib.machinery.SOURCE_SUFFIXES:
return ModuleType.Source
# Module is Bytecode
if module_file_extension in importlib.machinery.BYTECODE_SUFFIXES:
return ModuleType.Bytecode
# Module is Compiled Extension
if module_file_extension in importlib.machinery.EXTENSION_SUFFIXES:
return ModuleType.Extension
msg = f'Module {self.module_path} refers to a valid module file in this context, but the suffix {module_file_extension} could not be matched to a known module type. Please contact the author of IEnv'
raise RuntimeError(msg)
msg = f"Computed module path {self.module_path} is neither a directory or a file. This shouldn't happen; most likely, the path was changed by another process"
raise RuntimeError(msg)
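The same directory-vs-file classification can be sketched standalone against a throwaway tree (the `classify` helper and directory names are invented; only the package/namespace branch is shown):

```python
import importlib.machinery
import tempfile
from pathlib import Path

def classify(path: Path) -> str:
    # Directories: a package iff it contains __init__.py, else a namespace.
    if path.is_dir():
        return 'package' if (path / '__init__.py').is_file() else 'namespace'
    # Files: classify by the full (joined) suffix, as module_type does.
    suffix = ''.join(path.suffixes)
    if suffix in importlib.machinery.SOURCE_SUFFIXES:
        return 'source'
    if suffix in importlib.machinery.BYTECODE_SUFFIXES:
        return 'bytecode'
    if suffix in importlib.machinery.EXTENSION_SUFFIXES:
        return 'extension'
    raise ValueError(suffix)

with tempfile.TemporaryDirectory() as tmp:
    pkg = Path(tmp) / 'pkg'
    pkg.mkdir()
    (pkg / '__init__.py').touch()
    ns = Path(tmp) / 'ns'
    ns.mkdir()
    kinds = (classify(pkg), classify(ns))
print(kinds)  # ('package', 'namespace')
```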
####################
# - IEnv Module Spec/Loader
####################
@functools.cached_property
def module_source_path(self) -> Path | None:
"""Path of the file from which this module is loaded, or None for namespace modules (which have no source file)."""
if self.module_type == ModuleType.Package:
return self.module_path / '__init__.py'
if self.module_type == ModuleType.Namespace:
return None
return self.module_path
@property
def module_loader(self) -> importlib.abc.Loader | None:
"""Selects an appropriate loader for this module, or None for namespace modules (which need no loader)."""
if self.module_type == ModuleType.Namespace:
return None
return _MODULE_LOADERS[self.module_type](
self.ienv_module_name, os.fspath(self.module_source_path)
)
@property
def module_spec(self) -> importlib.machinery.ModuleSpec:
"""Construct a ModuleSpec with appropriate attributes.
We select module attributes via the ModuleSpec constructor, according to the following understanding of Python's import semantics.
ModuleSpec -> __spec__
Controls the entire import process of a module.
Its attributes set the module attributes.
When __spec__.parent is undefined, __package__ is used.
__main__ has a special __spec__, which might be None.
name -> __name__
Identifies the module in sys.modules.
loader -> __loader__
Actually loads the module on import.
origin -> __file__
Path to the file from which this module is loaded.
If the module isn't loaded from a file, this is None.
MUST be 'None' for Namespace modules.
NEVER defined for Builtin/Frozen modules.
MAY be left undefined for domain-specific reasons.
submodule_search_locations -> __path__
ONLY set for package modules (and may be empty).
(The DEFINITION of a package is "a module with __path__")
In this context, namespace packages are "packages".
loader_state
Module-specific data provided to the loader.
Unused.
cached -> __cached__
MAY be defined IF __file__ is defined.
Path to a compiled version of __file__.
Doesn't have to point to a path that exists.
MAY be set without __file__, but this is atypical.
Can be set to None if compiled code isn't used.
parent -> __package__
For __init__.py, this is the same as 'name'
For top-level modules, this is ''.
Else, this is the fully-qualified name of the module's parent package.
When __package__ is undefined, __spec__.parent is used.
has_location
When True, 'origin' is a loadable location.
When False, it is not.
Note, this is merely a hint given to the Loader.
is_package
Following InspectLoader.is_package, namespace packages are not "packages".
"""
spec = importlib.machinery.ModuleSpec(
self.ienv_module_name, ## __name__
self.module_loader, ## __loader__
origin=(
os.fspath(self.module_source_path)
if self.module_source_path is not None
else None ## Namespace modules have no file origin
),
loader_state=None,
is_package=self.module_type == ModuleType.Package,
)
spec.submodule_search_locations = (
[os.fspath(self.module_path)]
if self.module_type in {ModuleType.Package, ModuleType.Namespace}
else None
)
spec.cached = None
return spec
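For comparison, here is a minimal, self-contained sketch of hand-building a `ModuleSpec` for a plain source file, following the same constructor-driven attribute mapping documented above (the module name `demo_mod` and its contents are invented):

```python
import importlib.machinery
import importlib.util
import tempfile
from pathlib import Path

with tempfile.TemporaryDirectory() as tmp:
    src = Path(tmp) / 'demo_mod.py'
    src.write_text('ANSWER = 42\n')

    # name -> __name__, loader -> __loader__, origin -> __file__.
    loader = importlib.machinery.SourceFileLoader('demo_mod', str(src))
    spec = importlib.machinery.ModuleSpec(
        'demo_mod',
        loader,
        origin=str(src),
        is_package=False,
    )
    spec.has_location = True  # hint that 'origin' is a loadable location

    # Create the module from the spec, then let the loader execute it.
    mod = importlib.util.module_from_spec(spec)
    spec.loader.exec_module(mod)

print(mod.ANSWER)  # 42
```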
####################
# - sys.meta_path Finder
####################
class IEnvMetaPathFinder(importlib.abc.MetaPathFinder):
@staticmethod
def find_spec(
fullname: str,
path: str | None, # noqa: ARG004
target: types.ModuleType | None = None, # noqa: ARG004
) -> importlib.machinery.ModuleSpec | None:
"""When the import 'fullname' has the IEnv prefix, load the module from the IEnv deps path."""
if fullname.startswith(_IENV_PREFIX):
mod_info = IEnvModuleInfo(ienv_module_name=fullname)
return mod_info.module_spec
# Pass to Next MetaPathFinder
return None
####################
# - Initialization
####################
def init(
ienv_base_path: Path,
always_passthrough: frozenset[ModuleName] = frozenset(),
):
"""Initialize IEnv handling."""
global IENV_BASE_PATH, ALWAYS_PASSTHROUGH # noqa: PLW0603
IENV_BASE_PATH = ienv_base_path
ALWAYS_PASSTHROUGH = always_passthrough
is_in_ienv.cache_clear()
compute_ienv_name.cache_clear()
compute_ienv_deps_path.cache_clear()
## compute_ienv_module_prefix uses no globals
# Modify Builtins
## You can always get the original back via 'ienv.builtins__import()'
builtins.__import__ = _import
# Add MetaPathFinder
## 'ienv.free()' removes it from 'sys.meta_path' again
sys.meta_path.insert(0, IEnvMetaPathFinder)
def free():
"""Cease IEnv handling, affecting only **new** ienv-dependent imports.
Nothing is deleted from `sys.modules`.
As a result, if `import name` was IEnv-dependent, then:
- Variables referring to an IEnv-dependent module will still work.
- `sys.modules[ienv_prefix + 'name']` will still refer to the IEnv-dependent module.
- Any stored IEnv-dependent `name`, ex. in a variable or a callback, will still refer to the IEnv-dependent module.
There are a few gotchas (_Don't Do This_):
- Dynamic ienv-dependent imports **will not work**.
- `import _ienv_ienvname__name` will **only** work if `sys.modules` still caches that name.
"""
global IENV_BASE_PATH, ALWAYS_PASSTHROUGH # noqa: PLW0603
IENV_BASE_PATH = None
ALWAYS_PASSTHROUGH = None
# Modify Builtins
builtins.__import__ = builtins__import
# Remove MetaPathFinder
sys.meta_path.remove(IEnvMetaPathFinder)
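The "nothing is deleted from `sys.modules`" behavior that `free()` relies on can be demonstrated standalone: a name already present in `sys.modules` is served from the cache, with no finder involved (the module name `demo_cached_mod` is invented):

```python
import sys
import types

# Simulate an already-imported module by planting it in the cache.
mod = types.ModuleType('demo_cached_mod')
sys.modules['demo_cached_mod'] = mod

import demo_cached_mod  # served straight from sys.modules; no finder runs

cached_is_same = demo_cached_mod is mod
print(cached_is_same)  # True

del sys.modules['demo_cached_mod']  # existing references still work
```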

src/ienv/main.py 100644 (new file, +23 lines)

@ -0,0 +1,23 @@
from pathlib import Path
import ienv
import rich.traceback
rich.traceback.install(show_locals=True)
if __name__ == '__main__':
# Modify Import Machinery
ienv_base_path: Path = Path(__file__).resolve().parent / 'addons'
ienv.init(ienv_base_path, always_passthrough={'rich'})
# Addon-Specific Imports Now Work
print('Importing Addon 1')
from addons import addon_1
print('Importing Addon 2')
from addons import addon_2
# Test Addons
print()
print('Addon 1 Function: np.__file__:', addon_1.np_file_addon_1())
print('Addon 2 Function: np.__file__:', addon_2.np_file_addon_2())


@ -38,7 +38,7 @@ def zipped_addon( # noqa: PLR0913
path_addon_pkg: Path to the folder containing __init__.py of the Blender addon.
path_addon_zip: Path to the Addon ZIP to generate.
path_pyproject_toml: Path to the `pyproject.toml` of the project.
This is made available to the addon, to de-duplicate definition of name,
This is made available to the addon, to de-duplicate definition of name,
The .zip file is deleted afterwards, unless `remove_after_close` is specified.
"""
# Delete Existing ZIP (maybe)