MModular
Created by TilliFe on 7/18/2024 in #community-showcase
Endia
I am currently trying out a lot of new stuff for Endia in private, I will post some updates soon. Cheers!
57 replies
MModular
Created by TilliFe on 7/18/2024 in #community-showcase
Endia
A brief summary of Custom Ops :mojo: https://x.com/endia_ai/status/1854883860888121651
57 replies
MModular
Created by Caroline on 10/17/2024 in #community-showcase
Modverse #43: MAX 24.5, our biggest Mojo update ever, and Mojo's debut in the TIOBE index
Amazing overview :mojonightly: Thank you!
9 replies
MModular
Created by TilliFe on 7/18/2024 in #community-showcase
Endia
I am not sure if this helps you in any way; if not, let's just take it as a checkpoint of what Endia currently can and cannot do. 🙃 I'd be super happy to hear more about what exactly you would like to see in the long term. Could you create a list of features (possibly with some examples) so that we can all learn a bit more about the power of function transformations? That would be super awesome! :mojo:
57 replies
MModular
Created by TilliFe on 7/18/2024 in #community-showcase
Endia
#########################################################
# Second round - create second branch
#########################################################

# Initialize input to take the else branch in foo...
x = nd.array("[100.0, 2.0, 3.0]")

# Compute result and derivatives (with type hints)
y = foo_jitted(x)[nd.Array]
dy_dx = foo_jac(x)[nd.Array]
d2y_dx2 = foo_hes(x)[nd.Array]

# Print results
print(str(y))        # -9913.0
print(str(dy_dx))    # [-200.0, -4.0, -6.0]
print(str(d2y_dx2))  # [[-2.0, 0.0, 0.0],
                     #  [0.0, -2.0, 0.0],
                     #  [0.0, 0.0, -2.0]]


#########################################################
# Third round - use second branch again
#########################################################

# Initialize input to take the else branch in foo...
x = nd.array("[200.0, 2.0, 3.0]")

# Compute result and derivatives (with type hints)
y = foo_jitted(x)[nd.Array]
dy_dx = foo_jac(x)[nd.Array]
d2y_dx2 = foo_hes(x)[nd.Array]

# Print results
print(str(y))        # -39913.0
print(str(dy_dx))    # [-400.0, -4.0, -6.0]
print(str(d2y_dx2))  # [[-2.0, 0.0, 0.0],
                     #  [0.0, -2.0, 0.0],
                     #  [0.0, 0.0, -2.0]]
57 replies
MModular
Created by TilliFe on 7/18/2024 in #community-showcase
Endia
import endia as nd


# Define the function (with data-dependent control flow)
def foo(x: nd.Array) -> nd.Array:
    if x.load(0) < 10:
        return nd.sum(x**2)
    else:
        return -nd.sum(x**2) + 100


def main():
    # Create callables for the jacobian and hessian
    foo_jitted = nd.jit(foo)
    foo_jac = nd.grad(foo_jitted)
    foo_hes = nd.grad(foo_jac)

    #########################################################
    # First round - create initial branch
    #########################################################

    # Initialize input to take the if branch in foo...
    x = nd.array("[1.0, 2.0, 3.0]")

    # Compute result and derivatives (with type hints)
    y = foo_jitted(x)[nd.Array]
    dy_dx = foo_jac(x)[nd.Array]
    d2y_dx2 = foo_hes(x)[nd.Array]

    # Print results
    print(str(y))        # 14.0
    print(str(dy_dx))    # [2.0, 4.0, 6.0]
    print(str(d2y_dx2))  # [[2.0, 0.0, 0.0],
                         #  [0.0, 2.0, 0.0],
                         #  [0.0, 0.0, 2.0]]
57 replies
MModular
Created by TilliFe on 7/18/2024 in #community-showcase
Endia
Hi 🧙, at the moment I'd say it is still a bit too early for this kind of flexible transformation. I am currently rethinking most parts of the Endia core and I am planning to rebuild things from scratch (again), so things will hopefully become better and more flexible in the next iteration. With regard to custom transformations, I would like to refer to Endia's custom_ops as described in https://endia.vercel.app/docs/custom_ops. Nonetheless, I wrote out a little program which gives a more or less comprehensive overview of what Endia can currently do in terms of function transformations and how one might apply control flow:
1. We create a function foo which has some control flow inside of it.
2. We create a jitted version of this function (optional, but for the sake of chaining transformations let's do it here too) and pass it to the grad and jacobian function transformations.
3. Then, in the three following rounds, we use those transformed functions (which are basically just a bunch of custom structs called Callables), pass a differently initialized x into them, and check whether the transformed versions branch correctly.
57 replies
MModular
Created by Martin Dudek on 9/16/2024 in #questions
to_numpy with Mojo 24.5
I think I have been there too. Does the following example from the MAX repo help as a reference? https://github.com/modularml/max/blob/434daac5b52226d8c7ea024d08df127af5fce9dd/examples/serve/openclip-mojo-onnx/python_utils.mojo#L61
9 replies
MModular
Created by TilliFe on 7/18/2024 in #community-showcase
Endia
57 replies
MModular
Created by TilliFe on 7/18/2024 in #community-showcase
Endia
Hi Martin, I am indeed planning to integrate those features, at least at a high level: things like high-level modules and some standard NN models. The image is simply meant to show some possible applications (stuff I am personally really interested in), but it definitely does not cover all possible applications of a comprehensive array library. (Just look at where NumPy is used nowadays, even in satellites floating through space...) Thank you for mentioning this, I should write down something like a roadmap to make things more transparent.
57 replies
MModular
Created by TilliFe on 7/18/2024 in #community-showcase
Endia
I fixed JIT compilation with MAX. :mojo:
What was the problem? If you previously ran the simple MLP benchmarks inside Endia's benchmarks directory, you might have noticed that the version using MAX for JIT compiling Endia subgraphs took ages compared to not using MAX. Why? When transferring data from the Endia Graph to the MAX Graph/Model and back, we did not properly make use of TensorMaps, but converted the arguments (a List of Endia Arrays) to a list of NumPy arrays first (a List of PythonObjects, expensive!), which were then converted again into a set of MAX Tensors for further use by the MAX engine. This sounds terrible indeed 🤦‍♂️ and I only saw the obvious alternative now: we can create MAX Tensors as inputs to an executable MAX Model which do not own their data pointer! From now on, the inputs to a MAX Model merely borrow UnsafePointers from Endia Arrays for the duration of the MAX Model execution. Additionally, outputs from a MAX Model were previously copied (also super expensive). Now, since those outputs will usually be destroyed anyway, we can just steal the outputs' UnsafePointers and let the Endia Arrays own them after execution.
All in all, there are no unnecessary data copies anymore, and Endia and MAX can now work on the same data. This dramatically speeds up JIT compilation in Endia. Due to MAX's highly optimized ops, this speedup will be especially significant when training larger neural networks. Ultimately, this also gives me confidence that Endia can greatly benefit from using MAX once it supports GPUs. Cheers! 🧙
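To make the borrow-vs-copy idea concrete, here is a tiny, purely illustrative Mojo sketch (plain UnsafePointer code with made-up variable names, not the actual Endia/MAX bridge or the MAX Tensor API): the old path duplicated every element on the way in and out, while the new path simply shares the existing pointer for the duration of the call.

from memory import UnsafePointer


fn main():
    # Pretend this buffer belongs to an Endia Array.
    var size = 3
    var array_data = UnsafePointer[Float32].alloc(size)
    for i in range(size):
        array_data[i] = Float32(i + 1)

    # Old bridge (expensive): copy every element into a fresh buffer before
    # execution, then copy the results back afterwards.
    var copied = UnsafePointer[Float32].alloc(size)
    for i in range(size):
        copied[i] = array_data[i]

    # New bridge (cheap): the input merely borrows the existing pointer for
    # the duration of the call, so both sides read and write the same memory.
    var borrowed = array_data
    borrowed[0] = 42.0
    print(array_data[0])  # 42.0 -- no copy happened

    copied.free()
    array_data.free()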
57 replies
MModular
Created by TilliFe on 7/18/2024 in #community-showcase
Endia
57 replies
MModular
Created by TilliFe on 7/18/2024 in #community-showcase
Endia
Absolutely! :mojo: I had a similar idea and wanted to reach out to you on that as well.
57 replies
MModular
Created by TilliFe on 7/18/2024 in #community-showcase
Endia
Endia's FFT implementation, despite its compactness, delivers performance not far behind established frameworks. Further optimizations and algorithmic refinements could push Endia's performance to fully match or even exceed existing solutions.
57 replies
MModular
Created by TilliFe on 7/18/2024 in #community-showcase
Endia
57 replies
MModular
Created by TilliFe on 7/18/2024 in #community-showcase
Endia
New Fast Fourier Transform Module in Endia. 🌊 https://x.com/fe_tilli/status/1827434391330558226
57 replies
MModular
Created by HALFPOTATO on 8/8/2024 in #questions
Autodiff For Mojo
Hi, yeah, usability-wise (and also performance-wise) there is definitely still a ton of difference, since Endia is in very active development. Endia shall soon become very similar to PyTorch; however, the goal is to make what PyTorch currently has in its functional module a true first-class citizen in Endia. If this works as I envision, Endia can also be used like a more functional ML framework, similar to JAX. This would mean people with different philosophies of coding could come together and use just one engine seamlessly, which would be pretty nice. As Benny mentioned, implementation-wise Endia is completely different from any other ML framework, since it is written in Mojo, a single-layer stack basically. Cheers 🧙
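For a rough idea of what that functional, JAX-like style looks like, here is a minimal sketch that simply reuses the nd.jit / nd.grad calls from the Endia showcase thread above (an illustration of the direction, not a stability guarantee):

import endia as nd


# A plain function of an Endia Array...
def f(x: nd.Array) -> nd.Array:
    return nd.sum(x**2)


def main():
    # ...transformed into its (jitted) gradient function, JAX-style
    df = nd.grad(nd.jit(f))

    x = nd.array("[1.0, 2.0, 3.0]")
    print(str(df(x)[nd.Array]))  # [2.0, 4.0, 6.0]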
10 replies
MModular
Created by HALFPOTATO on 8/8/2024 in #questions
Autodiff For Mojo
I agree.
10 replies
MModular
Created by TilliFe on 7/18/2024 in #community-showcase
Endia
Endia nightly now uses MAX/Mojo nightly. This was long overdue. 👷‍♀️ 👷‍♂️ 🧙
57 replies
MModular
Created by TilliFe on 7/18/2024 in #community-showcase
Endia
Good question! Let's first test whether importing the module works at all on your machine; then we can generalize this to any kind of external module used in a (nested) project. Basic Import Test: Create a new directory test and copy the endia.mojopkg inside of it. Then, next to it, create a file where you try to import endia. If that works, you can check out the next step. Example:
.
├── endia.mojopkg
└── use_endia_here.mojo
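For the import check itself, use_endia_here.mojo can contain something as small as the following sketch (it just reuses calls from the showcase code earlier in this log):

# use_endia_here.mojo -- minimal check that the endia.mojopkg import works
import endia as nd


def main():
    x = nd.array("[1.0, 2.0, 3.0]")
    print(str(nd.sum(x**2)))  # 14.0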
General Usage:
- If you build a nested module or a project with a lot of subdirectories, make sure that all subfolders that use an external module (e.g. Endia) have an __init__.mojo file. This turns the subfolders into packages. Check out the docs for more information: https://docs.modular.com/mojo/manual/packages.
- Once you have modularized your project, you can place the endia.mojopkg at the top level of your directory. Then you should be able to import endia at any level. Example:
.
├── endia.mojopkg
├── level1_dir
│   ├── __init__.mojo
│   └── use_endia_here.mojo
└── run_level1stuff_here.mojo
57 replies