Launching Version 13.1 of Wolfram Language & Mathematica 🙀🤠🥳—Stephen Wolfram Writings
The Epic Continues…
Last week it was 34 years since the original launch of Mathematica and what's now the Wolfram Language. And through all those years we've energetically continued building further and further, adding ever more functionality, and steadily extending the domain of the computational paradigm.
In recent years we've established something of a rhythm, delivering the fruits of our development efforts roughly twice a year. We released Version 13.0 on December 13, 2021. And now, roughly six months later, we're releasing Version 13.1. As usual, even though it's a ".1" release, it's got a lot of new (and updated) functionality, some of which we've worked on for many years but have only now finally brought to fruition.
For me it's always exciting to see what we manage to deliver in each new version. And in Version 13.1 we have 90 completely new functions—as well as 203 existing functions with substantial updates. And beyond what appears in specific functions, there's also major new functionality in Version 13.1 in areas like user interfaces and the compiler.
The Wolfram Language as it exists today encompasses a vast range of functionality. But its great power comes not just from what it contains, but also from how coherently everything in it fits together. And for nearly 36 years I've taken it as a personal responsibility to ensure that that coherence is maintained. It's taken both great focus and plenty of deep intellectual work. But as I experience them every day in my use of the Wolfram Language, I'm proud of the results.
And for the past four years I've been sharing the "behind the scenes" of how it's achieved—by livestreaming our Wolfram Language design review meetings. It's an unprecedented level of openness—and engagement with the community. In designing Version 13.1 we've done 90 livestreams—lasting more than 96 hours. And in opening up our process we're providing visibility not only into what was built for Version 13.1, but also why it was built, and how decisions about it were made.
But, OK, so what finally is in Version 13.1? Let's talk about some highlights….
Beyond Listability: Introducing Threaded
From the very beginning of Mathematica and the Wolfram Language we've had the concept of listability: if you add two lists, for example, their corresponding elements will be added:

It's a very convenient mechanism, that typically does exactly what you'd want. And for 35 years we haven't really thought about extending it. But if we look at code that gets written, it often happens that there are parts that basically implement something very much like listability, but slightly more general. And in Version 13.1 we have a new symbolic construct, Threaded, that effectively allows you to easily generalize listability.
Consider:

This uses ordinary listability, effectively computing:

But what if you want instead to "go down a level" and thread {x,y} into the lowest elements of the first list? Well, now you can use Threaded to do that:

On its own, Threaded is just a symbolic wrapper:

But as soon as it appears in a function—like Plus—that has the attribute Listable, it specifies that the listability should be applied after what's specified inside Threaded is "threaded" at the lowest level.
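The "thread at the lowest level" behavior can be sketched outside the Wolfram Language too. Here's a rough Python analogue (my own illustration for intuition, not Wolfram's implementation) of combining an item elementwise with the lowest level of a nested list:

```python
from operator import add

def thread_lowest(f, array, item):
    # Recurse until we reach the lowest level of the nested list,
    # then combine elementwise with the threaded-in item.
    if array and isinstance(array[0], list):
        return [thread_lowest(f, sub, item) for sub in array]
    return [f(a, b) for a, b in zip(array, item)]

# Analogous to {{1, 2}, {3, 4}} + Threaded[{10, 20}]
print(thread_lowest(add, [[1, 2], [3, 4]], [10, 20]))
# [[11, 22], [13, 24]]
```

With ordinary listability, {10, 20} would instead be aligned with the top level, pairing 10 with {1, 2} and 20 with {3, 4}.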
Here's another example. Create a list:

How should we then multiply each element by {1,–1}? We could do this with:

But now we've got Threaded, and so instead we can just say:

You can give Threaded as an argument to any listable function, not just Plus and Times:

You can use Threaded and ordinary listability together:

You can have several Threadeds together as well:

Threaded, by the way, gets its name from the function Thread, which explicitly does "threading", as in:

By default, Threaded will always thread into the lowest level of a list:


Here's a real-life example of using Threaded like this. The data in a 3D color image consists of a rank-3 array of triples of RGB values:

This multiplies every RGB triple by {0,1,2}:

Most of the time you either want to use ordinary listability that operates at the top level of a list, or you want to use the default form of Threaded, that operates at the lowest level of a list. But Threaded has a more general form, in which you can explicitly say what level you want it to operate at.
Here's the default case:

Here's level 1, which is just like ordinary listability:

And here's threading into level 2:

Threaded provides a very convenient way to do all sorts of array-combining operations. There's additional complexity when the object being "threaded in" itself has multiple levels. The default in this case is to align the lowest level in the thing being threaded in with the lowest level of the thing into which it's being threaded:

Here now is "ordinary listability" behavior:

For the arrays we're dealing with here, the default behavior is equivalent to:

Sometimes it's clearer to write this out in a form like

which says that the first level of the array inside the Threaded is to be aligned with the second level of the outside array. In general, the default case is equivalent to –1 → –1, specifying that the bottom level of the array inside the Threaded should be aligned with the bottom level of the array outside.
Yet More Language Convenience Functions
In every version of the Wolfram Language we try to add new functions that will make general programs easier to write and easier to read. In Version 13.1 the most important such function is Threaded. But there are quite a few others as well.
First in our collection for Version 13.1 is DeleteElements, which deletes specified elements from a list. It's like Complement, except that it doesn't reorder the list (analogous to the way DeleteDuplicates removes duplicate elements without reordering, in the way that Union does):

DeleteElements also allows more detailed control of how many copies of an element can be deleted. Here it's up to 2 b's and 3 c's:

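For intuition, here's a rough Python sketch (my own analogy, not Wolfram's implementation) of order-preserving deletion with per-element limits, where `limits` maps an element to the maximum number of copies to delete (None meaning "all copies"):

```python
def delete_elements(lst, limits):
    # Remove elements from lst in order; limits maps each element to the
    # maximum number of copies to delete (None = delete every copy).
    remaining = dict(limits)
    out = []
    for x in lst:
        n = remaining.get(x)
        if x in remaining and (n is None or n > 0):
            if n is not None:
                remaining[x] = n - 1
            continue  # this copy is deleted
        out.append(x)
    return out

print(delete_elements(list("abcbcca"), {"b": 2, "c": 3}))
# ['a', 'a']
```

Note that, unlike a set-based Complement, the surviving elements keep their original order.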
Speaking of DeleteDuplicates, another new function in Version 13.1 is DeleteAdjacentDuplicates:

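The idea of collapsing only adjacent repeats—while keeping later reappearances—has a compact Python equivalent (an analogy, not the Wolfram implementation) using itertools.groupby:

```python
from itertools import groupby

def delete_adjacent_duplicates(lst):
    # groupby groups runs of equal adjacent elements;
    # keep one representative per run.
    return [key for key, _run in groupby(lst)]

print(delete_adjacent_duplicates([1, 1, 2, 2, 2, 1, 3, 3]))
# [1, 2, 1, 3]
```

Unlike full deduplication, the second 1 survives because it isn't adjacent to the first run of 1s.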
We've had Union, Intersection and Complement since Version 1.0. In Version 13.1 we're adding SymmetricDifference: find elements that (in the 2-argument case) are in one list or the other, but not both. For example, what countries are in the G20 or the EU, but not both?

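For the two-argument case, Python's built-in set operator `^` computes the same thing. (The country sets below are small made-up samples for illustration, not the full G20 or EU membership lists.)

```python
g20_sample = {"Germany", "France", "Japan", "Brazil"}   # partial sample
eu_sample = {"Germany", "France", "Ireland", "Sweden"}  # partial sample

# Symmetric difference: in one set or the other, but not both
print(sorted(g20_sample ^ eu_sample))
# ['Brazil', 'Ireland', 'Japan', 'Sweden']
```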
Let's say you have several lists, and you want to know what elements are unique to just one of those lists, and don't occur in several of them. The new UniqueElements tells you.
As an example, this tells us which letters uniquely occur in various alphabets:

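The underlying idea—elements appearing in exactly one of the input lists—can be sketched in Python (my own illustration of the concept, not the actual UniqueElements implementation):

```python
from collections import Counter

def unique_elements(*lists):
    # Count how many of the input lists each element appears in,
    # then report, per list, the elements found in that list only.
    membership = Counter(x for lst in lists for x in set(lst))
    return [sorted(x for x in set(lst) if membership[x] == 1) for lst in lists]

print(unique_elements([1, 2, 3], [2, 3, 4], [3, 5]))
# [[1], [4], [5]]
```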
We've had Map and Apply, with short forms /@ and @@, ever since Version 1.0. In Version 4.0 we added @@@ to represent Apply[f,expr,1]. But we never added a separate function to correspond to @@@. And over the years, there've been quite a few occasions where I've basically wanted, for example, to do something like "Fold[@@@, ...]". Obviously Fold[Apply[#1,#2,1]&,...] would work. But it feels as if there's a "missing" named function. Well, in Version 13.1, we added it: MapApply is equivalent to @@@:

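For readers more familiar with Python: applying a function to the elements of each sublist (what @@@ and MapApply do in Wolfram Language) corresponds roughly to itertools.starmap, or calling `f(*args)` per item:

```python
from itertools import starmap

pairs = [(1, 2), (3, 4), (5, 6)]
# Apply pow to the *elements* of each pair, rather than to each pair
print(list(starmap(pow, pairs)))
# [1, 81, 15625]
```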
Another small convenience added in Version 13.1 is SameAs—essentially an operator form of SameQ. Why is such a construct needed? Well, there are always tradeoffs in language design. And back in Version 1.0 we decided to make SameQ work with any number of arguments (so you can test whether a whole sequence of things are the same). But that means that for consistency SameQ[expr] must always return True—so it's not available as an operator form of SameQ. And that's why now in Version 13.1 we're adding SameAs, which joins the family of operator-form functions like EqualTo and GreaterThan:

Procedural programming—often with "variables hanging out"—isn't the preferred style for most Wolfram Language code. But sometimes it's the most convenient way to do things. And in Version 13.1 we've added a small piece of streamlining by introducing the function Until. Ever since Version 1.0 we've had While[test,body], which repeatedly evaluates body while test is True. But if test isn't True even at first, While won't ever evaluate body. Until[test,body] does things the other way around: it evaluates body until test becomes True. So if test isn't True at first, Until will still evaluate body once, in effect only looking at the test after it's evaluated the body.
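This is the classic do-while pattern. A minimal Python sketch of the semantics (hypothetical helper, just to show the "body runs at least once" behavior):

```python
def until(test, body):
    # Evaluate body first, then check test: body always runs at least once.
    while True:
        body()
        if test():
            break

counter = [10]
until(lambda: counter[0] >= 5, lambda: counter.append(counter.pop() + 1))
print(counter[0])  # 11: the test was already True, but body still ran once
```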
Last but not least in the list of new core language functions in Version 13.1 is ReplaceAt. Replace attempts to apply a replacement rule to a whole expression—or a whole level in an expression. ReplaceAll (/.) does the same thing for all subparts of an expression. But quite often one wants more control over where replacements are done. And that's what ReplaceAt provides:

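To make "replace only at a given position" concrete, here's a toy Python analogue over nested lists (my own sketch; note it uses 0-based indices, unlike Wolfram Language's 1-based part specifications, and a plain function in place of a rule):

```python
def replace_at(expr, rule, pos):
    # Apply rule only at the part of the nested list addressed by the
    # position indices in pos; everything else is left unchanged.
    expr = list(expr)  # copy this level so the original is untouched
    head, rest = pos[0], pos[1:]
    expr[head] = rule(expr[head]) if not rest else replace_at(expr[head], rule, rest)
    return expr

print(replace_at([1, [2, 3], 4], lambda x: x * 10, [1, 0]))
# [1, [20, 3], 4]
```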
An important feature is that it also has an operator form:

Why is this important? The answer is that it gives a symbolic way to specify not just what replacement is made, but also where it's made. And for example this is what's needed in specifying steps in proofs, say as generated by FindEquationalProof.
Emojis! And More Multilingual Support
What is a character? Back when Version 1.0 was released, characters were represented as 8-bit objects: usually ASCII, but you could pick another "character encoding" (hence the CharacterEncoding option) if you wanted. Then in the early 1990s came Unicode—which we were one of the very first companies to support. Now "characters" could be 16-bit constructs, with nearly 65,536 possible "glyphs" allocated across different languages and uses (including some mathematical symbols that we introduced). Back in the early 1990s Unicode was a newfangled thing, that operating systems didn't yet have built-in support for. But we were betting on Unicode, and so we built our own infrastructure for handling it.
Thirty years later Unicode is indeed the universal standard for representing character-like things. But somewhere along the way, it turned out the world needed more than 16 bits' worth of character-like things. At first it was about supporting variants and historic writing systems (think: cuneiform or Linear B). But then came emoji. And it became clear that—yes, arguably in a return to the Egyptian hieroglyph style of communication—there was an almost infinite number of possible pictorial emoji that could be made, each of them encoded as their own Unicode code point.
It's been a slow expansion. Original 16-bit Unicode is "plane 0". Now there are up to 16 additional planes. Not quite 32-bit characters, but given the way computers work, the approach now is to allow characters to be represented by 32-bit objects. It's far from trivial to do that uniformly and efficiently. And for us it's been a long process to upgrade everything in our system—from string manipulation to notebook rendering—to handle full 32-bit characters. And that's finally been achieved in Version 13.1.
But that's far from all. In English we're pretty much used to being able to treat text as a sequence of letters and other characters, with each character being separate. Things get a bit more complicated when you start to worry about diphthongs like æ. But if there are fairly few of these, it works to just introduce them as individual "Unicode characters" with their own code point. But there are plenty of languages—like Hindi or Khmer—where what appears in text like an individual character is actually a composite of letter-like constructs, diacritical marks and other things. Such composite characters are typically represented as "grapheme clusters": runs of Unicode code points. The rules for handling these things can be quite complicated. But after many years of development, major operating systems now successfully do it in most cases. And in Version 13.1 we're able to make use of this to support such constructs in notebooks.
OK, so what does 32-bit Unicode look like? Using CharacterRange (or FromCharacterCode) we can dive in and just see what's out there in "character space". Here's part of ordinary 16-bit Unicode space:

Here's some of what happens in "Plane 1" above character code 65535, in this case catering to "legacy computing":

Plane 0 (below 65535) is pretty much all full. Above that, things are sparser. But around 128000, for example, there are lots of emoji:

You can use these in the Wolfram Language, and in notebooks, just like any other characters. So, for example, you can have wolf and ram variables:

The 🐏 sorts before the 🐺 because it happens to have a numerically smaller character code:

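You can check the code points directly—the same fact holds in any Unicode-aware language. In Python, for instance:

```python
# Emoji are ordinary code points, so they sort numerically:
# RAM is U+1F40F, WOLF FACE is U+1F43A.
print(hex(ord("🐏")), hex(ord("🐺")))  # 0x1f40f 0x1f43a
print(sorted(["🐺", "🐏"]))           # ['🐏', '🐺']
```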
In a notebook, you can enter emoji (and other Unicode characters) using standard operating system tools—like ctrl-cmd-space on macOS:

The world of emoji is rapidly evolving—and that can sometimes lead to problems. Here's an emoji range that includes some very familiar emoji, but on at least one of my computer systems also includes emoji that display only as placeholder glyphs:

The reason that happens is that my default fonts don't contain glyphs for those emoji. But all is not lost. In Version 13.1 we're including a font from Twitter that aims to contain glyphs for pretty much all emoji:

Beyond dealing with individual Unicode characters, there's also the matter of composites, and grapheme clusters. In Hindi, for example, two characters can combine into something that's rendered (and treated) as one:

The first character here can stand on its own:

But the second is basically a modifier that extends the first character (in this particular case adding a vowel sound):

But once the composite हि has been formed it acts "textually" just like a single character, in the sense that, for example, the cursor moves through it in one step. When it appears "computationally" in a string, however, it can still be broken into its constituent Unicode elements:

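The same decomposition is visible in any language that exposes strings as code point sequences. In Python, for example, the single visual unit हि is two code points:

```python
# The grapheme cluster हि is two Unicode code points:
# DEVANAGARI LETTER HA (U+0939) + DEVANAGARI VOWEL SIGN I (U+093F).
s = "हि"
print(len(s))                    # 2
print([hex(ord(c)) for c in s])  # ['0x939', '0x93f']
```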
This kind of setup can be used not only for a language like Hindi but also for European languages that have diacritical marks like umlauts:

Even though this looks like one character—and in Version 13.1 it's treated like that for "textual" purposes, for example in notebooks—it's ultimately made up of two distinct "Unicode characters":

In this particular case, though, it can be "normalized" to a single character:

It looks the same, but now it really is just one character:

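This is standard Unicode NFC normalization, which you can reproduce with, say, Python's unicodedata module:

```python
import unicodedata

decomposed = "a\u0308"  # 'a' followed by COMBINING DIAERESIS
composed = unicodedata.normalize("NFC", decomposed)

print(len(decomposed), len(composed))  # 2 1
print(composed == "\u00e4")            # True: now the single character ä
```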
Here's a "combined character" that you can form

but for which there's no single character to which it normalizes:

The concept of composite characters applies not only to ordinary text, but also to emoji. For example, take the emoji for a woman

together with the emoji for a microscope

and combine them with the "zero-width-joiner" character (which, needless to say, doesn't display as anything)

and you get (yes, somewhat bizarrely) a woman scientist!

Needless to say, you can do this computationally—though the "calculus" of what's been defined so far in Unicode is fairly bizarre:

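The same ZWJ (zero-width-joiner) construction works at the raw code point level in any language. A Python sketch:

```python
woman = "\U0001F469"       # 👩
zwj = "\u200D"             # ZERO WIDTH JOINER (displays as nothing)
microscope = "\U0001F52C"  # 🔬

scientist = woman + zwj + microscope
print(scientist)       # 👩‍🔬 on fonts that support the ZWJ sequence
print(len(scientist))  # 3 code points, rendered as one grapheme cluster
```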
I'm sort of hoping that the future of semantics doesn't end up being defined by the way emoji combine 😎.
As one last—arguably hacky—example of combining characters, Unicode defines various "two-letter" combinations to be flags. Type the regional indicator symbol for U, then the one for S, and you get 🇺🇸!
Once again, this can be made computational:

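Flags are built from the Unicode "regional indicator" symbols, which start at U+1F1E6 for A. A small Python sketch (my own helper function, shown for illustration):

```python
def flag(country_code):
    # Map each ASCII letter of a two-letter ISO country code to the
    # corresponding REGIONAL INDICATOR SYMBOL (U+1F1E6 is letter A).
    return "".join(chr(0x1F1E6 + ord(c) - ord("A")) for c in country_code.upper())

print(flag("US"))  # 🇺🇸 (rendering depends on the installed fonts)
```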
(And, yes, it's an interesting question what renders here, and what doesn't. On some operating systems, no flags are rendered, and we have to pull in a special font to do it.)

It used to be that the only "special key sequence" one absolutely needed to know in order to use Wolfram Notebooks was shift-enter. But gradually there have started to be more and more high-profile operations that are conveniently done by "pressing a button". And rather than expecting people to remember all those special key sequences (or think to look in menus for them) we've decided to introduce a toolbar that will be displayed by default in every standard notebook. Version 13.1 has the first iteration of this toolbar. Subsequent versions will support an increasing range of functionality.
It's not been easy to design the default toolbar (and we hope you'll like what we came up with!) The main problem is that Wolfram Notebooks are very general, and there are a great many things you can do with them—which it's challenging to organize into a manageable toolbar. (Some specific kinds of notebooks have had their own specialized toolbars for a while, which were easier to design by virtue of their specialization.)
So what's in the toolbar? On the left are a couple of evaluation controls:
The first button means "Evaluate", and is simply equivalent to pressing shift-return (as its tooltip says). The next means "Abort", and will stop a computation. To its right is the menu shown above. The first part of the menu lets you choose what will be evaluated. (Don't forget the extremely useful "Evaluate In Place" that lets you evaluate whatever code you have selected—say to turn RGBColor[1,0,0] in your input into a color swatch.) The bottom part of the menu gives a couple of more detailed (but highly useful) evaluation controls.
Moving along the toolbar, we next have:

If your cursor isn't already in a cell, the pulldown lets you select what kind of cell you want to insert (it's similar to the "tongue" that appears within the notebook). (If your cursor is already inside a cell, then as in a typical word processor, the pulldown will tell you the style that's being used, and let you reset it.)
The next button gives you a little panel to control the appearance of cells, changing their background colors, frames, dingbats, etc.
Next come cell-related buttons. The first is for cell structure and grouping:

The next button copies input from above (cmd-L). It's an operation that I, for one, end up doing all the time. I'll have an input that I evaluate. Then I'll want to make a modified version of the input to evaluate again, while keeping the original. So I'll copy the input from above, edit the copy, and evaluate it again.
The next copies output from above. I don't find this quite as useful as copy input from above, but it can be helpful if you want to edit output for subsequent input, while leaving the "actual output" unchanged.
The next block of buttons is all about content in cells. The first (which you'll often press repeatedly) is for extending a selection—in effect going ever upwards in an expression tree. (You can get the same effect by pressing ctrl-. or by multiclicking, but it's a lot more convenient to repeatedly press a single button than to have to precisely time your multiclicks.)
Then there's the single-button way to get ctrl= for entering natural language input:

The next button iconizes your selection:

Iconization is something we introduced in Version 11.3, and it's proved incredibly useful, particularly for making code easy to read (say by iconizing details of options). (You can also iconize a selection from the right-click menu, or with ctrl-cmd-'.)
The next button is most relevant for code, and toggles commenting of a selection. Another brings up a palette for math typesetting. Another lets you enter TeX that will be converted to Wolfram Language math typesetting. Another brings up a drawing canvas. And another inserts a hyperlink (cmd-shift-H).
If you're in a text cell, the toolbar will look different, now sporting a text formatting control:
Most of this is fairly standard. One button lets you insert "code voice" material. The math-entry buttons are still in the toolbar for inserting math into a text cell.
At the right-hand end of the toolbar are three more buttons. The first gives you a dialog to publish your notebook to the cloud. The second opens documentation, either specifically looking up whatever you have selected in the notebook, or opening the front page ("root guide page") of the main Wolfram Language documentation. Finally, the last lets you search in your current notebook.
As I mentioned above, what's in Version 13.1 is just the first iteration of our default toolbar. Expect more features in later versions. One thing that's notable about the toolbar in general is that it's 100% implemented in Wolfram Language. And in addition to adding a great deal of flexibility, this also means that the toolbar immediately works on all platforms. (By the way, if you don't want the toolbar in a particular notebook—or for all your notebooks—just right-click the background of the toolbar to pick that option.)
Polishing the User Interface
We first introduced Wolfram Notebooks with Version 1.0 of Mathematica, in 1988. And ever since then, we've been progressively polishing the notebook interface, doing more with every new version.
The ctrl= mechanism for entering natural language ("Wolfram|Alpha-style") input debuted in Version 10.0—and in Version 13.1 it's now accessible from a button in the new default notebook toolbar. But what actually is such input once it's in a notebook? In the past, it's been a fairly complex symbolic structure primarily suitable for evaluation. But in Version 13.1 we've made it much simpler. And while that doesn't have any direct effect if you're just using it purely in a notebook, it does have an effect if you copy it into another application, like plain-text email. In the past this produced something that would work if pasted back into a notebook, but definitely wasn't particularly readable. In Version 13.1, it's now simply the Wolfram Language interpretation of your natural language input:

What happens if the computation you do in a notebook generates a huge output? Ever since Version 6.0 we've had some form of "output limiter", but in Version 13.1 it's become much sleeker and more useful. Here's a typical example:

Speaking of big outputs (as well as other things that keep the notebook interface busy), another change in Version 13.1 is the new asynchronous progress overlay on macOS. This doesn't affect other platforms where this problem had already been solved, but on the Mac changes in the OS had led to a situation where the notebook front end could mysteriously pop to the front on your desktop—a situation that has now been resolved.
One of the slightly unusual user interface features that's existed ever since Version 1.0 is the Why the Beep? menu item—which lets you get an explanation of any "error beep" that occurs while you're running the system. The function Beep lets you generate your own beep. And now in Version 13.1 you can use Beep["string"] to set up an explanation of "your beep", which users can retrieve through the Why the Beep? menu item.
The basic notebook user interface works as much as possible with standard interface elements on all platforms, so that when those elements are updated, we always automatically get the "most modern" look. But there are parts of the notebook interface that are quite specific to Wolfram Notebooks and are always custom. One that hadn't been updated for a while is the Preferences dialog—which now in Version 13.1 gets a full makeover:

When you tell the Wolfram Language to do something, it usually just goes off and does it, without asking you anything (well, unless it explicitly needs input, needs a password, etc.) But what if there's something that might be a good idea to do, though it's not strictly necessary? What should the user interface for this be? It's tricky, but I think we now have a solution that we've started deploying in Version 13.1.
In particular, in Version 13.1, there's an example related to the Wolfram Function Repository. Say you use a function for which an update is available. What now happens is that a blue box is generated that tells you about the update—though it still keeps going with the computation, ignoring the update:

If you click the Update Now button in the blue box you can do the update. And then the point is that you can run the computation again (for example, just by pressing shift-enter), and now it'll use the update. In a sense the core idea is to have an interface where there are potentially multiple passes, and where a computation always runs to completion, but you have an easy way to change how it's set up, and then run it again.
Large-Scale Code Editing
One of the great things about the Wolfram Language is that it works well for programs of any scale—from less than a line long to millions of lines long. And for the past several years we've been working on expanding our support for very large Wolfram Language programs. Using LSP (Language Server Protocol) we've provided the capability for most standard external IDEs to automatically do syntax coloring and other customizations for the Wolfram Language.
In Version 13.1 we're also adding a couple of features that make large-scale code editing in notebooks more convenient. The first—and widely requested—is block indent and outdent of code. Select the lines you want to indent or outdent and simply press tab or shift-tab to indent or outdent them:

Ever since Version 6.0 we've had the ability to work with .wl package files (as well as .wls script files) using our notebook editing system. A new default feature in Version 13.1 is numbering of all code lines that appear in the underlying file (and, yes, we correctly align line numbers to account for the presence of non-code cells):

So now, for example, if you get a syntax error from Get or a related function, you'll immediately be able to use the line number it reports to find where it occurs in the underlying file.
Scribbling on Notebooks
In Version 12.2 we introduced Canvas as a convenient interface for interactive drawing in notebooks. In Version 13.1 we're introducing the notion of toggling a canvas on top of any cell.
Given a cell, just select it and press the canvas button, and you'll get a canvas:

Now you can use the drawing tools in the canvas to create an annotation overlay:

If you evaluate the cell, the overlay will stay. (You can get rid of the "canvas wrapper" by applying Normal.)
Trees Continue to Grow 🌱🌳
In Version 12.3 we introduced Tree as a new fundamental construct in the Wolfram Language. In Version 13.0 we added a variety of styling options for trees, and in Version 13.1 we're adding more styling as well as a variety of new fundamental features.
An important update to the fundamental Tree construct in Version 13.1 is the ability to name branches at each node, by giving them in an association:

All tree functions now include support for associations:

In many uses of trees the labels of nodes are significant. But particularly in more abstract applications one often wants to deal with unlabeled trees. In Version 13.1 the function UnlabeledTree (roughly analogous to UndirectedGraph) takes a labeled tree, and basically removes all visible labels. Here is a standard labeled tree

and here's the unlabeled analog:

In Version 12.3 we introduced ExpressionTree for deriving trees from general symbolic expressions. Our plan is to have a range of "special trees" appropriate for representing different specific kinds of symbolic expressions. We're beginning this process in Version 13.1 by, for example, having the concept of "Dataset trees". Here's ExpressionTree converting a dataset to a tree:

And now here's TreeExpression "inverting" that, and producing a dataset:

(Remember the convention that *Tree functions return a tree, while Tree* functions take a tree and return something else.)
Here's a "graph rendering" of a more complicated dataset tree:

The new function TreeLeafCount lets you count the total number of leaf nodes on a tree (basically the analog of LeafCount for a general symbolic expression):

Another new function in Version 13.1 that's often useful for getting a sense of the structure of a tree without examining every node is RootTree. Here's a random tree:

RootTree can get a subtree that's "close to the root":

It can also get a subtree that's "far from the leaves", in this case going down to elements that are at level –2 in the tree:

In some ways the styling of trees is similar to the styling of graphs—though there are some significant differences as a result of the hierarchical nature of trees. By default, options inserted into a particular tree element affect only that tree element:

But you can give rules that specify how elements in the subtree below that element are affected:

In Version 13.1 there's now detailed control available for styling both nodes and edges in the tree. Here's an example that gives styling for the parent edges of nodes:

Options like TreeElementStyle determine styling from the positions of elements. TreeElementStyleFunction, on the other hand, determines styling by applying a function to the data at each node:

This uses both data and position information for each node:

In analogy with VertexShapeFunction for graphs, TreeElementShapeFunction provides a general mechanism to specify how nodes of a tree should be rendered. This named setting for TreeElementShapeFunction makes every node be displayed as a circle:

Yet More Date-Handling Details
We first introduced dates into the Wolfram Language in Version 2.0, and we introduced modern date objects in Version 10.0. But to really make dates fully computable, there are many detailed cases to consider. And in Version 13.1 we're dealing with yet another of them. Let's say you've got the date January 31, 2022. What date is one month later—given that there's no February 31, 2022?
If we define a month "physically", it corresponds to a certain fractional number of days:

And, yes, we can use this to figure out what is a month after January 31, 2022:

Slightly confusing here is that we're dealing with date objects of "day" granularity. We can see more if we go down to the level of minutes:

If one's doing something like astronomy, this kind of "physical" date computation is probably what one wants. But if one's doing everyday "human" activities, it's almost certainly not what one wants; instead, one wants to land on some calendar date or another.
Here's the default in the Wolfram Language:

However now in Model 13.1 we are able to parametrize extra exactly what we wish. This default is what we name "RollBackward": wherever we “land” by doing the uncooked date computation, we “roll backward” to the primary legitimate date. An alternate is "RollForward":

Whatever method one uses, there are going to be weird cases. Let's say we start with several consecutive dates:

With "RollBackward" we have the weirdness of repeating February 28:

With "RollForward" we have the weirdness of repeating March 1:

Is there any alternative? Yes, we can use "RollOver":

This keeps advancing through days, but then has the weirdness that it goes backwards. And, yes, there's no "right answer" here. But in Version 13.1 you can now specify exactly what behavior you want.
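As a sketch of the kind of call involved, assuming (as the text describes) that the rolling behavior is selected with a Method option to DatePlus:

```wolfram
(* one "calendar month" after January 31, 2022, rolling forward *)
DatePlus[DateObject[{2022, 1, 31}], {1, "Month"}, Method -> "RollForward"]
(* lands on March 1, 2022, rather than rolling back to February 28 *)
```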
The same issue arises not just for months, but also, for example, for years. And it affects not just DatePlus, but also DateDifference.
It's worth mentioning that in Version 13.1, in addition to dealing with the detail we've just discussed, the whole framework for doing "date arithmetic" in Wolfram Language has been made vastly more efficient, sometimes by factors of hundreds.
Capturing Video & More
We've had ImageCapture since Version 8.0 (in 2010) and AudioCapture since Version 11.1 (in 2017). Now in Version 13.1 we have VideoCapture. By default VideoCapture[] gives you a GUI that lets you record from your camera:

Clicking the down arrow opens up a preview window that shows your current video:

When you've finished recording, VideoCapture returns the Video object you created:

Now you can process or analyze this Video object just like you would any other:

VideoCapture[] is a blocking operation that waits until you've finished recording, then returns a result. But VideoCapture can also be used "indirectly" as a dynamic control. Thus, for example,

lets you asynchronously start and stop recording, while you do other things in your Wolfram Language session. Every time you stop recording, the value of video is updated.
VideoCapture records video from your camera (and you can use the ImageDevice option to specify which one if you have several). VideoScreenCapture, on the other hand, records from your computer screen, in effect providing a video analog of CurrentScreenImage.
VideoScreenCapture[], like VideoCapture[], is a blocking operation as far as the Wolfram Language is concerned. But if you want to watch something happening in another application (say, a web browser), it'll do just fine. And in addition, you can give a screen rectangle to capture a particular region of your screen:
VideoScreenCapture[{{0, 50}, {640, 498}}]
Then for example you can analyze the time series of RGB color levels in the video that's produced:

What if you want to screen record from a notebook? Well, then you can use the asynchronous dynamic recording mechanism, which exists in VideoScreenCapture just as it does in VideoCapture.
By the way, both VideoCapture and VideoScreenCapture capture audio by default. You can switch off audio recording either from the GUI, or with the option AudioInputDevice→None.
If you want to get fancy, you can screen record a notebook in which you're capturing video from your camera (which in turn shows you capturing a video, etc.):
VideoScreenCapture[EvaluationNotebook[]]
In addition to capturing video from real-time goings-on, you can also generate video directly from functions like AnimationVideo and SlideShowVideo, as well as by "touring" an image using TourVideo. In Version 13.1 there are some significant enhancements to TourVideo.
Take an animal scene and extract bounding boxes for elephants and zebras:

Now you can make a tour video that visits each animal:

Define a path function of a variable t:


Now we can use the path function to make a "spiralling" tour video:

College Calculus
Transforming college calculus was one of the early achievements of Mathematica. But even now we're continuing to add functionality to make college calculus ever easier and smoother to do, and more immediately connectable to applications. We've always had the function D for taking derivatives at a point. Now in Version 13.1 we're adding ImplicitD for finding implicit derivatives.
So, for example, it can find the derivative of x^{y} with respect to x, with y determined implicitly by the constraint x^{2} + y^{2} = 1:

Leave out the first argument and you'll get the standard college calculus "find the slope of the tangent line to a curve":

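A minimal sketch of the two calls being described; treat the exact argument order as an assumption based on the text:

```wolfram
ImplicitD[x^y, x^2 + y^2 == 1, x]  (* implicit derivative of x^y on the circle *)
ImplicitD[x^2 + y^2 == 1, x]       (* slope of the tangent line, -(x/y) *)
```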
So far all of this is a fairly straightforward repackaging of our longstanding calculus functionality. And indeed these kinds of implicit derivatives have been available for a long time in Wolfram|Alpha. But for Mathematica and the Wolfram Language we want everything to be as general as possible, and to support the kinds of things that show up in differential geometry, and in things like asymptotics and validation of implicit solutions to differential equations. So in addition to ordinary college-level calculus, ImplicitD can do things like finding a second implicit derivative on a curve defined by the intersection of two surfaces:

In Mathematica and the Wolfram Language Integrate is a function that just gets you answers. (In Wolfram|Alpha you can ask for a step-by-step solution too.) But particularly for educational purposes, and sometimes also when pushing boundaries of what's possible, it can be useful to do integrals in steps. And so in Version 13.1 we've added the function IntegrateChangeVariables for changing variables in integrals.
An immediate issue is that when you specify an integral with Integrate[...], Integrate will just go ahead and do the integral:

But for IntegrateChangeVariables you need an "undone" integral. And you can get this using Inactive, as in:

And given this inactive form, we can use IntegrateChangeVariables to do a "trig substitution":

The result is again an inactive form, now stating the integral differently. Activate goes ahead and actually does the integral:

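Sketched in code with a hypothetical integrand, the sequence of steps might look like this:

```wolfram
int = Inactive[Integrate][Sqrt[1 - x^2], {x, 0, 1}];
chg = IntegrateChangeVariables[int, u, x == Sin[u]]  (* still an inactive form *)
Activate[chg]                                        (* now actually integrates: Pi/4 *)
```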
IntegrateChangeVariables can deal with multiple integrals as well, and with named coordinate systems. Here it's transforming a double integral to polar coordinates:

Although the basic "structural" transformation of variables in integrals is quite straightforward, the full story of IntegrateChangeVariables is considerably more complicated. "School-level" changes of variables are usually carefully arranged to come out easily. But in the more general case, IntegrateChangeVariables ends up having to do nontrivial transformations of geometric regions, tricky simplifications of integrands subject to certain constraints, and so on.
In addition to changing variables in integrals, Version 13.1 also introduces DSolveChangeVariables for changing variables in differential equations. Here it's transforming the Laplace equation to polar coordinates:

Sometimes a change of variables can just be a convenience. But sometimes (think General Relativity) it can lead one to a whole different view of a system. Here, for example, an exponential transformation converts the usual Cauchy–Euler equation to a form with constant coefficients:

Fractional Calculus
The first derivative of x^{2} is 2x; the second derivative is 2. But what is the ½th derivative? It's a question that was asked (for example by Leibniz) even in the first years of calculus. And by the 1800s Riemann and Liouville had given an answer, which in Version 13.1 can now be computed by the new FractionalD:

And, yes, do another ½th derivative and you get back the 1st derivative:

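As a sketch, following the standard Riemann–Liouville formula Γ(k+1)/Γ(k-α+1) x^{k-α}:

```wolfram
FractionalD[x^2, {x, 1/2}]                        (* (8 x^(3/2))/(3 Sqrt[Pi]) *)
FractionalD[(8 x^(3/2))/(3 Sqrt[Pi]), {x, 1/2}]   (* back to 2 x *)
```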
In the more general case we have:

And this works even for negative derivative orders, so that, for example, the (–1)st derivative is an ordinary integral:

Computing a fractional derivative can be at least as difficult as computing an integral. But FractionalD can often still do it

though the result can quickly become quite complicated:

Why is FractionalD a separate function, rather than just being part of a generalization of D? We discussed this for quite a while. And the reason we introduced the explicit FractionalD is that there isn't just one unique definition of fractional derivatives. In fact, in Version 13.1 we also support the Caputo fractional derivative (or differintegral) CaputoD.
For the ½th derivative of x^{2}, the answer is still the same:

But as soon as a function isn't zero at x = 0 the answer can be different:

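A sketch of the contrast: the Caputo derivative of a constant vanishes, while the Riemann–Liouville one does not:

```wolfram
CaputoD[1 + x^2, {x, 1/2}]       (* the constant contributes nothing *)
FractionalD[1 + x^2, {x, 1/2}]   (* picks up an extra 1/Sqrt[Pi x] term *)
```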
CaputoD is a particularly convenient definition of fractional differentiation when one's dealing with Laplace transforms and differential equations. And in Version 13.1 we can not only compute CaputoD but also do integral transforms and solve equations that involve it.
Here's a ½th-order differential equation

and one of another fractional order

as well as a πth-order one:

Note the appearance of MittagLefflerE. This function (which we introduced in Version 9.0) plays the same kind of role for fractional derivatives that Exp plays for ordinary derivatives.
More Math: Some Long Awaited
In February 1990 an internal bug report was filed against the still-in-development Version 2.0 of Mathematica:

It’s taken a very long time (and comparable points have been reported many occasions), however in Model 13.1 we are able to lastly shut this bug!
Contemplate the differential equation (the Clairaut equation):

What DSolve does by default is to provide the generic resolution to this equation, when it comes to the parameter 𝕔_{1}. However the delicate level (which in optics is related to caustics) is that the household of options for various values of 𝕔_{1} has an envelope which isn’t itself a part of the household of options, however can also be an answer:

In Model 13.1 you possibly can request that resolution with the choice IncludeSingularSolutions→True:

And right here’s a plot of it:

DSolve was a brand new operate (again in 1991) in Model 2.0. One other new operate in Model 2.0 was Residue. And in Model 13.1 we’re additionally including an extension to Residue: the operate ResidueSum. And whereas Residue finds the residue of a fancy operate at a selected level, ResidueSum finds a sum of residues.
This computes the sum of all residues for a operate, throughout the entire advanced airplane:

This computes the sum of residues inside a selected area, on this case the unit disk:

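A sketch of the two kinds of computation; the region-restricted syntax here is an assumption:

```wolfram
ResidueSum[1/(z^2 + 1), z]           (* residues at z == I and z == -I cancel to 0 *)
ResidueSum[{Cot[z], Abs[z] < 4}, z]  (* poles at 0 and ±Pi, each with residue 1 *)
```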
Create Your Own "Guide to Functions" Pages
An important part of the built-in documentation for the Wolfram Language is what we call "guide pages": pages like the following that organize functions (and other constructs) to give an overall "cognitive map" and summary of some area:

In Version 13.1 it's now easy to create your own custom guide pages. You can list built-in functions or other constructs, as well as things from the Wolfram Function Repository and other repositories.
Go to the "root page" of the Documentation Center and press the icon:

You'll get a blank custom guide page:

Fill in the guide page however you want, then use Deploy to deploy the page either locally, or to your cloud account. Either way, the page will now show up in the menu from the top of the root guide page (and it'll also show up in search):

You might end up creating just one custom guide page for your favorite functions. Or you might create several, say one for each task or topic you commonly deal with. Guide pages aren't about putting in the effort to create full-scale documentation; they're much more lightweight, and aimed more at providing quick ("what was that function called?") reminders and "big-picture" maps, leveraging all the specific function and other documentation that already exists.
Visual Effects & Beautification
At first it seemed like a minor feature. But once we'd implemented it, we realized it was much more useful than we'd expected. Just as you can style a graphics object with its color (and, as of Version 13.0, its filling pattern), now in Version 13.1 you can style it with its drop shadowing:

Drop shadowing turns out to be a nice way to "bring graphics to life"

or to emphasize one element over others:

It works well in geo graphics as well:

DropShadowing allows detailed control over the shadows: what direction they're in, how blurred they are and what color they are:

Drop shadowing is more complicated "under the hood" than one might imagine. And when possible it works using hardware GPU pixel shaders, the same technology that we've used since Version 12.3 to implement material-based surface textures for 3D graphics. In Version 13.1 we've explicitly exposed some well-known underlying types of 3D shading. Here's a geodesic polyhedron (yes, that's another new function in Version 13.1), with its surface normals added (using the again-new function EstimatedPointNormals):

Here's the most basic form of shading: flat shading of each facet (and the specularity in this case doesn't "catch" any facets):

Here now is Gouraud shading, with a somewhat-faceted glint:

And then there's Phong shading, looking somewhat more natural for a sphere:

Ever since Version 1.0, we've had an interactive way to rotate, and zoom into, 3D graphics. (Yes, the mechanism was a bit primitive 34 years ago, but it soon got to roughly its modern form.) But in Version 13.1 we're adding something new: the ability to "dolly" into a 3D graphic, imitating what would happen if you actually walked into a physical version of the graphic, as opposed to just zooming your camera:

And, yes, things can get a bit surreal (or "trippy"), here dollying in and then zooming out:
3D Voronoi!
There are some functions that, over the course of years, have been requested again and again. In the past these have included infinite undo, high-dpi displays, multiple axis plots, and others. And I'm happy to say that most of these have now been taken care of. But there's one, seemingly obscure, "straggler" that I've heard about for well over 25 years, and that I've actually also wanted myself quite a few times: 3D Voronoi diagrams. Well, in Version 13.1, they're here.
Set up 25 random points in 3D:


Now make a Voronoi mesh for these points:

To "see inside" we can use opacity:

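The construction just shown can be sketched as:

```wolfram
pts = RandomReal[1, {25, 3}];
mesh = VoronoiMesh[pts];  (* as of 13.1, works for 3D points *)
(* one way to "see inside": style the 2D cells (faces) with opacity *)
VoronoiMesh[pts, MeshCellStyle -> {2 -> Opacity[0.3]}]
```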
Why was this so hard? In a Voronoi diagram there's a cell that surrounds each original point, and includes everywhere that's closer to that point than to any other. We've had 2D Voronoi meshes for a long time:

But there's something easier about the 2D case. The issue isn't so much the algorithm for generating the cells as it is how the cells can be represented in such a way that they're useful for subsequent computations. In the 2D case each cell is just a polygon.
But in the 3D case the cells are polyhedra, and to make a Voronoi mesh we have to have a polyhedral mesh where all the polyhedra fit together. And it's taken us many years to build the large tower of computational geometry necessary to support this. There's a somewhat simpler case, based purely on cells that are always either simplices or hexahedra, that we've used for finite-element solutions to PDEs for a while. But in a true 3D Voronoi diagram that's not enough: the cells can be any (convex) polyhedral shape.
Here are the "puzzle piece" cells for the 3D Voronoi mesh we made above:

Reconstructing Geometry from Point Clouds
Pick 500 random points within an annulus:


Version 13.1 now has a general function for reconstructing geometry from a cloud of points:

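A sketch of the reconstruction, using RandomPoint to sample the annulus:

```wolfram
pts = RandomPoint[RegionDifference[Disk[{0, 0}, 2], Disk[{0, 0}, 1]], 500];
ReconstructionMesh[pts]  (* an approximate mesh region recovered from the cloud *)
```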
(Of course, given only a finite number of points, the reconstruction can't be expected to be perfect.)
The function also works in 3D:


ReconstructionMesh is a general superfunction that uses a variety of methods, including extended versions of the functions ConcaveHullMesh and GradientFittedMesh that were introduced in Version 13.0. And in addition to reconstructing "solid objects", it can also reconstruct lower-dimensional things like curves and surfaces:

A related function new in Version 13.1 is EstimatedPointNormals, which estimates not the geometry itself, but normal vectors to each element in the geometry:

New in Visualization
In every new version for the past 30 years we've steadily expanded our visualization capabilities, and Version 13.1 is no exception. One function we've added is TernaryListPlot, an analog of ListPlot that conveniently plots triples of values where what one's trying to emphasize is their ratios. For example, let's plot data from our knowledgebase on the sources of electricity for different countries:

The plot shows the "energy mix" for different countries, with the ones on the bottom axis being those with zero nuclear. Adding colors for each axis, together with grid lines, helps explain how to read the plot:

Most of the time, plots are plotting numbers, or at least quantities. In Version 13.0, we extended functions like ListPlot to also accept dates. In Version 13.1 we're going much further, and introducing the possibility of plotting what amount to purely symbolic values.
Let's say our data consists of letters A through C:

How do we plot these? In Version 13.1 we just specify an ordinal scale:

OrdinalScale lets you specify that certain symbolic values are to be treated as if they're in a specified order. There's also the concept of a nominal scale, represented by NominalScale, in which different symbolic values correspond to different "categories", but in no particular order.
Representing Amounts of Chemicals
Molecule lets one symbolically represent a molecule. Quantity lets one symbolically represent a quantity with units. In Version 13.1 we now have the new construct ChemicalInstance that's in effect a merger of these, allowing one to represent a certain amount of a certain chemical.
This gives a symbolic representation of 1 liter of acetone (by default at standard temperature and pressure):

We can ask what the mass of this instance of this chemical is:

ChemicalConvert lets us do a conversion returning particular units:

Here's instead a conversion to moles:

This directly gives the amount of substance that 1 liter of acetone corresponds to:

This generates a sequence of straight-chain hydrocarbons:

Here's the amount of substance corresponding to 1 g of each of these chemicals:

ChemicalInstance lets you specify not just the amount of a substance, but also its state, in particular temperature and pressure. Here we're converting 1 kg of water at 4°C to be represented in terms of volume:

Chemistry as Rule Application: Symbolic Pattern Reactions
At the core of the Wolfram Language is the abstract idea of applying transformations to symbolic expressions. And at some level one can view chemistry and chemical reactions as a physical instantiation of this idea, where one's dealing not with abstract symbolic constructs, but instead with actual molecules and atoms.
In Version 13.1 we're introducing PatternReaction as a symbolic representation for classes of chemical reactions, in effect providing an analog for chemistry of Rule for general symbolic expressions.
Here's an example of a "pattern reaction":

The first argument specifies a pair of "reactant" molecule patterns to be transformed into "product" molecule patterns. The second argument specifies which atoms in which reactant molecules map to which atoms in which product molecules. If you mouse over the resulting pattern reaction, you'll see corresponding atoms "light up":

Given a pattern reaction, we can use ApplyReaction to apply the reaction to concrete molecules:

Here are plots of the resulting product molecules:

The molecule patterns in the pattern reaction are matched against subparts of the concrete molecules, then the transformation is done, leaving the other parts of the molecules unchanged. In a sense it's the direct analog of something like

where the b in the symbolic expression is replaced, and the result is "knitted back" to fill in where the b was.
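In code, the symbolic analog being described is just ordinary replacement:

```wolfram
f[a, b, c] /. b -> Sqrt[x]
(* f[a, Sqrt[x], c]: only b is replaced; the rest is knitted back around it *)
```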
You can do what amounts to various kinds of "chemical functional programming" with ApplyReaction and PatternReaction. Here's an example where we're essentially building up a polymer by successive nesting of a reaction:


It's often convenient to build pattern reactions symbolically using Wolfram Language "chemical primitives". But PatternReaction also lets you specify reactions as SMARTS strings:

PDEs for Rods, Rubber and More
It's been a 25-year journey, steadily developing our built-in PDE capabilities. And in Version 13.1 we've added several (admittedly somewhat technical) features that have been much requested, and are important for solving particular kinds of real-world PDE problems. The first feature is being able to set up a PDE as axisymmetric. Normally a 2D diffusion term would be assumed Cartesian:

But now you can say you're dealing with an axisymmetric system, with your coordinates interpreted as radius and height, and everything assumed symmetrical in the azimuthal direction:

What's important about this isn't just that it makes it easy to set up certain kinds of equations, but also that in solving equations axial symmetry can be assumed, allowing much more efficient methods to be used:

Also in Version 13.1 is an extension to the solid mechanics modeling framework introduced in Version 13.0. Just as there's viscosity that damps out motion in fluids, so there's a similar phenomenon that damps out motion in solids. It's more of an engineering story, and it's usually described in terms of two parameters: mass damping and stiffness damping. And now in Version 13.1 we support this kind of so-called Rayleigh damping in our modeling framework.
Another phenomenon included in Version 13.1 is hyperelasticity. If you bend something like metal beyond a certain point (but not so far that it breaks), it'll stay bent. But materials like rubber and foam (and some biological tissues) can "bounce back" from basically any deformation.
Let's imagine that we have a square of rubber-like material. We anchor it on the left, and then we pull it on the right with a certain force. What does it do?
This defines the properties of our material:

We define variables for the problem, representing x and y displacements by u and v:

Now we can set up our whole problem, and solve the PDEs for it for each value of the force:

Then we can plot the results, and see the rubber being nonlinearly stretched:

There’s in the long run appreciable depth in our dealing with of PDEbased modeling, and our growing capability to do “multiphysics” computations that span a number of kinds of physics (mechanical, thermal, electromagnetic, acoustic, …). And by now we’ve bought practically 1000 pages of documentation purely about PDEbased modeling. And for instance in Model 13.1 we’ve added a monograph particularly about hyperelasticity, in addition to expanded our assortment of documented PDE fashions:
Interpretable Machine Studying
Let’s say you might have educated a machine studying mannequin and also you apply it to a selected enter. It offers you some outcome. However why? What have been the essential options within the enter that led it to that outcome? In Model 13.1 we’re introducing a number of capabilities that attempt to reply such questions.
Right here’s some easy “coaching information”:

We are able to use machine studying to make a predictor for this information:

Making use of the predictor to a selected enter offers us a prediction:

What was essential in making this prediction? The "SHAPValues" property launched in Model 12.3 tells us what contribution every characteristic made to the outcome; on this case v was extra essential than u in figuring out the worth of the prediction:

However what about basically, for all inputs? The brand new operate FeatureImpactPlot offers a visible illustration of the contribution or “affect” of every characteristic in every enter on the output of the predictor:

What does this plot imply? It’s principally exhibiting how usually there are what contributions from values of the 2 enter options. And with this explicit predictor we see that there’s a variety of contributions from each options.
If we use a unique technique to create the predictor, the outcomes could be fairly totally different. Right here we’re utilizing linear regression, and it seems that with this technique v by no means has a lot affect on predictions:

If we make a predictor using a decision tree, the feature impact plot shows the splitting of impact corresponding to different branches of the tree:

FeatureImpactPlot gives a kind of bird's-eye view of the impact of different features. FeatureValueImpactPlot gives more detail, showing as a function of the actual values of input features the impact points with those values would have on the final prediction (and, yes, the actual points plotted here are based on data simulated on the basis of the distribution inferred by the predictor; the actual data is usually too big to want to carry around, at least by default):

CumulativeFeatureImpactPlot gives a visual representation of how "successive" features affect the final value for each (simulated) data point:

For predictors, feature impact plots show impact on predicted values. For classifiers, they show impact on (log) probabilities for particular outcomes.
Model Predictive Control
One area that leverages many algorithmic capabilities of the Wolfram Language is control systems. We first started developing control systems functionality more than 25 years ago, and by Version 8.0 ten years ago we started to have built-in functions like StateSpaceModel and BodePlot specifically for working with control systems.
Over the past decade we've gradually been adding more built-in control systems capabilities, and in Version 13.1 we're now introducing model predictive controllers (MPCs). Many simple control systems (like PID controllers) take an ad hoc approach in which they effectively just "watch what a system does" without trying to have a specific model for what's going on inside the system. Model predictive control is about having a specific model for a system, and then deriving an optimal controller based on that model.
For example, we might have a state-space model for a system:

Then in Version 13.1 we can derive (using our parametric optimization capabilities) an optimal controller that minimizes a certain set of costs while satisfying particular constraints:

The SystemsModelControllerData that we get here contains a variety of elements that allow us to automate the controller design and analysis workflow. For example, we can get a model that represents the controller operating in a closed loop with the system it's controlling:

Now let's imagine that we drive this whole system with the input:

Now we can compute the output response for the system, and we see that both output variables are driven to zero by the operation of the controller:

Within the SystemsModelControllerData object generated by ModelPredictiveController is the actual controller computed in this case, using the new construct DiscreteInputOutputModel:

What actually is this controller? Ultimately it's a collection of piecewise functions that depend on the values of the states x_{1}[t] and x_{2}[t]:

And this shows the different state-space regions in which the controller takes different forms:

Algorithmic and Randomized Quizzes
In Version 13.0 we introduced our question and assessment framework, which lets you author things like quizzes in notebooks, together with assessment functions, then deploy them for use. In Version 13.1 we're adding capabilities to let you algorithmically or randomly generate questions.
The two new functions QuestionGenerator and QuestionSelector let you specify questions to be generated according to a template, or randomly selected from a pool. You can either use these functions directly in pure Wolfram Language code, or you can use them through the Question Notebook authoring GUI.
When you select Insert Question in the GUI, you now get a choice between Fixed Question, Randomized Question and Generated Question:

Select Randomized Question and you'll get

which then lets you enter questions, and eventually produce a QuestionSelector, which will select newly randomized questions for every copy of the quiz that's produced:

Version 13.1 also introduces some enhancements for authoring questions. An example is a pure-GUI "no-code" way to specify multiple-choice questions:

The ExprStruct Data Structure
In the Wolfram Language expressions normally have two aspects: they have a structure, and they have a meaning. Thus, for example, Plus[1,1] has both a definite tree structure

and has a value:

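The two aspects can be seen with ordinary language constructs: Unevaluated exposes the structure, while normal evaluation gives the value:

```wolfram
TreeForm[Unevaluated[Plus[1, 1]]]  (* the tree structure of the expression *)
Plus[1, 1]                         (* the value: 2 *)
```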
In the normal operation of the Wolfram Language, the evaluator is automatically applied to all expressions, and essentially the only way to avoid evaluation by the evaluator is to insert "wrappers" like Hold and Inactive that necessarily change the structure of expressions.
In Version 13.1, however, there's a new way to handle "unevaluated" expressions: the "ExprStruct" data structure. ExprStructs represent expressions as raw data structures that are never directly seen by the evaluator, but can nevertheless be structurally manipulated.
This creates an ExprStruct corresponding to the expression {1,2,3,4}:

This structurally wraps Total around the list, but does no evaluation:

One can see this by "visualizing" the data structure:

Normal takes an ExprStruct object and converts it to a normal expression, to which the evaluator is automatically applied:

One can do a variety of essentially structural operations directly on an ExprStruct. This applies Plus, then maps Factorial over the resulting ExprStruct:

The result is an ExprStruct representing an unevaluated expression:

With "MapImmediateEvaluate" an evaluation is done every time the mapping operation generates an expression:

One powerful use of ExprStruct is in doing code transformations. In a typical case one might want to import expressions from, say, a .wl file, then manipulate them in ExprStruct form. In Version 13.1 Import now supports an "ExprStructs" import element:

This selects expressions that correspond to definitions, in the sense that they have SetDelayed as their head:

Here's a visualization of the first one:

Super-Efficient Compiler-Based External Code Interaction
Let's say you've got external code that's in a compiled C-compatible dynamic library. An important new capability in Version 13.1 is a super-efficient and very streamlined way to call any function in a dynamic library directly from within the Wolfram Language.
It's one of an accelerating stream of developments being made possible by the large-scale infrastructure build-out that we've been doing in connection with the new Wolfram Language compiler, and in particular it often leverages our sophisticated new type-handling capabilities.
As a first example, let's consider the RAND_bytes ("cryptographically secure pseudorandom number generator") function in OpenSSL. The C declaration for this function is:
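As given in OpenSSL's <openssl/rand.h> header:

```c
/* Fills buf with num cryptographically strong pseudorandom bytes;
   returns 1 on success, 0 otherwise. */
int RAND_bytes(unsigned char *buf, int num);
```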
In Version 13.1 we now have a symbolic way to represent such a declaration directly in the Wolfram Language:

(In general we'd also have to specify the library that this function is coming from. OpenSSL happens to be a library that's loaded by default with the Wolfram Language, so you don't need to mention it.)
There are quite a few new things going on in the declaration. First, as part of our collection of compiled types, we're adding ones like "CInt" and "CChar" that refer to raw C language types (here int and char). There's also CArray, which is for declaring C arrays. Notice the new ::[ ... ] syntax for TypeSpecifier, which allows compact specifications for parametrized types, like the char* here, which is described in the Wolfram Language as "CArray"::["CChar"].
Having set up the declaration, we now have to create an actual function that can take an argument from the Wolfram Language, convert it to something suitable for the library function, then call the library function, and convert the result back to Wolfram Language form. Here's a way to do that in this case:

What we get back is a compiled code function that we can immediately use, and that works by very efficiently calling the library function:

The FunctionCompile above uses several constructs that are new in Version 13.1. What it essentially does is take a Wolfram Language integer (which it assumes to be a machine integer), cast it into a C integer, then pass this to the library function, along with a specification of a C char * into which the library function will put its result, and from which the final Wolfram Language result will be retrieved.
It's worth emphasizing that most of the complexity here has to do with handling data types and conversions between them, something the Wolfram Language goes to a lot of trouble to avoid normally exposing the user to. But when we're connecting to external languages that make fundamental use of types, there's no choice but to deal with them, and the complexity they involve.
In the FunctionCompile above, the first new construct we encounter is

The basic purpose of this is to create the buffer into which the external function will write its results. The buffer is an array of bytes, declared in C as char *, or here as "CArray"::["CChar"]. There's a wrinkle, though: who's going to manage the memory associated with this array? The "Managed":: type specifier says that the Wolfram Language wrapper will do memory management for this object.
The next new construct we see in the FunctionCompile is

Cast is one of a family of new functions that can appear in compilable code, but have no significance outside the compiler. Cast is used to specify that data should be converted to a form consistent with a specified type (here a C int type).
The core of the FunctionCompile is the use of LibraryFunction, which is what actually calls the external library function that we declared with the library function declaration.
The last step in the function compiled by FunctionCompile is to extract data from the C array and return it as a Wolfram Language list. Doing this requires the new function FromRawPointer, which actually retrieves data from a specified location in memory. (And, yes, this is a raw dereferencing operation that can cause a crash if it isn't done correctly.)
All of this may at first seem rather complicated, but for what it's doing, it's remarkably simple, and it greatly leverages the whole symbolic structure of the Wolfram Language. It's also worth realizing that in this particular example, we're just dipping into compiled code and then returning results. In larger-scale cases we'd be doing many more operations inside compiled code, often specified directly by top-level Wolfram Language code, and so type declaration and conversion operations would be a smaller fraction of the code we have to write.
One feature of the example we've just looked at is that it only uses built-in types. But in Version 13.1 it's now possible to define custom types, such as the analog of C structs. As an example, consider the function ldiv from the C standard library. This function returns an object of type ldiv_t, defined by the following typedef:
Here's the Wolfram Language version of this declaration, based on setting up a "Product" type named "CLDivT":

(The "ReferenceSemantics"→False option specifies that this type will actually be passed around as a value, rather than just a pointer to a value.)
Now the declaration for the ldiv function can use this new custom type:

The final definition of the call to the external ldiv function is then:

And now we can use the function (and, yes, it will be as efficient as if we'd directly written everything in C):

The examples we've given here are very small ones. But the whole structure for external function calls that's now in Version 13.1 is set up to handle large and complex situations, and indeed we've been using it internally with great success to set up important new built-in pieces of the Wolfram Language.
One of the elements that's often needed in more complex situations is more sophisticated memory management, and our new "Managed" type provides a convenient and streamlined way to do this.
This makes a compiled function that creates an array of 10,000 machine integers:

Running the function effectively "leaks" memory:

But now define a version of the function in which the array is "managed":

Now the memory associated with the array is automatically freed when it's no longer referenced:

Directly Compiling Function Definitions
If you have an explicit pure function (Function[...]) you can use FunctionCompile to produce a compiled version of it. But what if you have a function that's defined using downvalues, as in:

In Version 13.1 you can directly compile function definitions like this. But, as is the nature of compilation, you have to declare what types are involved. Here is a declaration for the function fac that says it takes a single machine integer, and returns a machine integer:

Now we can create a compiled function that computes fac[n]:

The compiled function runs considerably faster than the ordinary symbolic definition:


The ability to declare and use downvalue definitions in compilation has the important feature that it lets you write a definition just once, and then use it both directly and in compiled code.
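Here is a minimal sketch of that workflow. The FunctionDeclaration and TypeSpecifier argument forms are assumptions based on the description above, and should be checked against the Version 13.1 documentation:

```wl
(* Sketch only: declaration forms assumed from the description above *)
fac[1] = 1;
fac[n_] := n fac[n - 1];
dec = FunctionDeclaration[fac,
   TypeSpecifier[{"MachineInteger"} -> "MachineInteger"]];
cfac = FunctionCompile[dec, Function[n, fac[n]]];
cfac[10]  (* same result as the ordinary fac[10] *)
```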
Manipulating Expressions in Compiled Code
An early focus of the Wolfram Language compiler has been handling low-level "machine" types, such as integers or reals of certain lengths. But one of the advances in the Version 13.1 compiler is direct support for an "InertExpression" type for representing any Wolfram Language expression within compiled code.
When you use something like FunctionCompile, it will explicitly try to compile whatever Wolfram Language expressions it's given. But if you wrap the expressions with InertExpression, the compiler will just treat them as inert structural objects of type "InertExpression". This sets up a compiled function that constructs an expression (implicitly of type "InertExpression"):

Evaluating the function constructs and then returns the expression:

Normally, within the compiler, an "InertExpression" object will be treated in a purely structural way, without any evaluation (and, yes, it's closely related to the "ExprStruct" data structure). But sometimes it's useful to perform evaluation on it, and you can do this with InertEvaluate:

Now the InertEvaluate does the evaluation before wrapping Hold around the inert expression:

The ability to handle expressions directly in the compiler might seem like some kind of detail. But it's actually hugely important in opening up possibilities for future development of the Wolfram Language. For the past 35 years we've internally been able to write low-level expression manipulation code as part of the C language core of the Wolfram Language kernel. But the capability of the Wolfram Language compiler to handle expressions now opens this up, and lets anyone write maximally efficient code for manipulating expressions that interoperates with everything else in the Wolfram Language.
And Still More…
Even beyond all the things I've discussed so far, there are all sorts of further additions and enhancements in Version 13.1, dotted throughout the system.
InfiniteLineThrough and CircularArcThrough have been added for geometric computation and geometric scene specification. Geometric scenes can now be styled for custom presentation:

There are new graph functions: GraphProduct, GraphSum and GraphJoin:

And there are new built-in families of graphs: TorusGraph and BuckyballGraph:

You can mix images directly into Graphics (and Graphics3D):

AbsoluteOptions now resolves many more options in Graphics, telling you what explicit value was used when you gave an option just as Automatic.
The function LeafCount now has a Heads option, to count expression branches inside heads. Splice works with any head, not just List. Functions like IntersectingQ now have SameTest options. You can specify TimeZone options using geographic entities (like cities).
FindClusters now lets you specify exactly how many clusters you want to partition your data into, as well as supporting UpTo[n].
In neural nets, ElementwiseLayer now supports "modern" non-convex, non-monotonic activation functions like Mish and GELU, AttentionLayer supports dropout and local masking, ReplicateLayer now supports integer arrays, and RandomArrayLayer supports additional statistical distributions. NetTrain now handles multi-output and non-scalar losses. Image encoders and decoders support resampling and padding, and there's now support for nucleus sampling. Our support for the ONNX transfer format continues to grow, with net operators added in Version 13.1.
CenteredInterval, introduced in Version 13.0, now supports 36 further special functions (and, yes, each needs theorems proved to make this work).
There'll be more coming on this in subsequent versions, but in Version 13.1 we're beginning the introduction of structured matrices that are stored and computed with in special, optimized ways. Examples include PermutationMatrix and LowerTriangularMatrix.
We've had extensive support for computational microscopy for a while. But in Version 13.1 the "BioImageFormat" Import format now adds importing of the more than 160 raw image formats used by different kinds of microscopes.
Version 13.0 dramatically expanded our ability to import PDFs. We've further enhanced this in Version 13.1, for example allowing positioned text to be imported into graphics as Text primitives.
We've supported standard text styles like bold and italic forever, but now we have a general way to handle struck-through text as well:

In addition to all these "inside-the-system" enhancements, we've also finished making it possible to download desktop versions of the Wolfram Language on all platforms (including Linux) while leaving documentation on the web. Documentation installation can now also be configured globally on a per-machine basis, rather than just on a per-user basis.
So, as of today, documentation or not, you can get Version 13.1 on your computer. Oh, and the Wolfram Cloud has also now been updated to use Version 13.1. I hope you enjoy the new features, and this next step on the epic journey of Mathematica and the Wolfram Language.