#or transpose from key of x to key of y
queerscout · 1 year
Text
I’ve decided to learn the clarinet and on one hand wow learning an instrument as an adult is so much easier than as a kid??  I’m two days in and feel confident with the fingerings of two octaves of G major (tone is... a work in progress).
On the other hand, coming from a flute, the register key and the SHEER NUMBER of keys on this thing break my brain.  Why can I press two different sets of keys to open or close the same exact holes?  I mean?  Sure, it’s convenient while my hands are learning what to do but also WHY was that necessary in the first place?  And this whole tipping your hand to cover one hole but hit a different key simultaneously?  Then there’s the register key nonsense?  You mean you use the same fingerings, plus opening this tiny little hole, to raise the pitch??
17 notes · View notes
kimistorm · 3 years
Text
Dancing in the rain [Bang Chan]
Fandom: Stray Kids
Pairing: [Chan x GN! Reader]
Warnings: mentions of thunder, but it's depicted in a more serene view and not scary
Word count: 1.5 k
Requested by: SKZ8BLACKPINK4 on Quotev
Prompts: “Thanks, it's the insomnia."
Chan blinked awake when he heard the soft closing of the door to your shared room. He tiredly reached a hand towards your side of the bed and came up empty, but still slightly warm. It was one of those miraculous nights where the two of you were able to fall asleep at the same time, but it seemed like you had other plans.
He let out a yawn before sitting up in bed and looking around through nearly-closed eyes as he tried to figure out what was happening. It was still dark, and turning on his phone (through even more squinted eyes) told him it was a little after 4 am. He let out a groan from the fatigue still weighing down on his bones before falling face-first into his pillows. He was sure you’d come back, you were probably just getting water.
His eyes closed fully and he readjusted himself to get more comfortable, but it was no good. His mind was already awake. Felix said he was going to swing by later to drop off some cookies. And he needed to pester Changbin about sending him a sample of the rap he was putting together. What were you doing? He couldn’t hear the sounds of anything awake in the rest of your shared apartment. A cool beat started playing in his head and he tried to commit it to memory to transpose once he woke up. Maybe it was a beat that Han could use, he could imagine Han doing some lyrical rapping to it. Were you okay? It’s been a while. Oh, and is that thunder outside?
A groan of defeat emitted from his throat as he sat up fully and pulled himself out of bed. If he was awake, may as well try and do something productive. The carpeted floor was soft against his feet as he pulled open the door to the bedroom and headed out into the living area. He was going to head into the kitchen to grab a drink, but he noticed the door to the patio open, and that stopped him in his tracks.
He nearly let out a shriek when he saw someone on the balcony, but his brain caught up at the last second and realized it was just you. There was a warm smile on his face as he crossed the living room to see you on the balcony. Aided by the fact you left the door open, he silently slipped his arms around you and rested his chin on your shoulder in a back hug. “Sorry,” you rubbed at his arms that were around your waist, “did I wake you up?”
“It’s okay,” he murmured back in his warm, low voice. “What are you doing out here? It’s like 4 am.”
“As if you can argue against that,” you teased with a smile, “Mr. disastrous sleep schedule.”
He rolled his eyes at your jab, “thanks, it’s the insomnia. But,” he pressed a quick kiss to your cheek, “I’m still asking.” There was a flash of light followed by a rolling rumble that made him realize that yes. It was thunder storming outside.
“It’s raining.” He hummed in agreeance and nuzzled his face into your neck, “it’s wonderful.” You took in a deep breath and smiled as you inhaled the wet earth smell. The gentle pittering of the rain interrupted by thunder was all you could hear as down below, there was hardly a car on the street. There was another flash of lightning and you gasped as you saw it fork through the sky before there was a crack of thunder. “I kind of want to dance in it.”
That got Chan to open his eyes fully as he looked at you with concern and shock barely illuminated on his face. “You want to dance in it?” he repeated, not sure that he heard it in his still-waking-up state.
You stuck a hand beyond the balcony to let raindrops cover it and you were pleased to find it cool, but not hard. “It’s not bad. And tomorrow’s Saturday anyway.” Chan himself wasn’t super keen on getting wet in his clothes, especially because it was cold and he was definitely not going to wear swim trunks out there. “Do you want to dance with me?” But he was totally whipped for you. “It’ll be less than five minutes...probably.” He couldn't say no.
“Right now?” he couldn’t help but ask as he glanced down at his outfit to confirm that he was shirtless and in a pair of sweatpants.
You glanced at him with a bit of a spark in your eye as you nudged him, “after you put on a shirt.” You turned back to the rain wistfully, “it might pass soon.”
He gently tugged you back inside and you looked at him in confusion, “grab your keys.” You grinned at his small confirmation as you hurried to put on one of the light jackets you had left on the couch and grab your keys from the little box at the door. Chan emerged from your room with a light jacket covering his bare chest and the two of you slipped out of your apartment.
When you got outside, you let out a giggle of delight as you spun around under the downpour. It may have gotten harder, but you didn’t mind. Your sandal-clad feet stepped in several puddles and your joyful shouts seemed out of place in the otherwise silence of the night. You noticed Chan still standing apprehensively under the overhang in front of your apartment complex, “may I have this dance?” you asked as you extended a hand out to him.
His face melted into a smile as he looked at you nearly perfectly illuminated under one of the streetlights. “Can’t leave you hanging, can I?” he asked as he confidently stepped out to take your hand, “that wouldn’t be very gentlemanly of me.”
You smiled as the two of you did a janky version of the waltz under the falling rain. Your feet were soaked as you breezed through puddles and the sky continued to be punctuated with bursts of lightning and thunder. “Thanks, Chan,” you murmured as you leaned into his chest, the two of you swaying in place.
“Anything for you babe.”
It was a gust of wind, and the sneeze it caused, that finally pulled the two of you back into the safety of the building. Your fingers were cold and clumsy as you shoved the key into the lock of your apartment, and you couldn’t help but burst into laughter once the door shut.
“What?” Chan asked gently as he pulled you into another wet hug.
“Nothing.” You smiled, “just happy.”
Chan felt a smile grow on his face as well, your joy was just infectious like that. “I’m glad you’re happy.” He started pulling the two of you to the bathroom, “but let’s get dried off.”
It wasn’t long until the two of you were in dry, warm clothes and back in bed. You cooed as you ran your hand through his hair, which had gone curly from the rain and the somewhat aggressive towel dry. “Felix is coming by later today,” Chan reported as the two of you cuddled and began to drift into sleep.
“K.”
Luckily for Chan, he was able to fall asleep, though it felt like only minutes before he was woken up by rapid knocking and his phone vibrating wildly.
“Is that Felix?” you squinted as you stared at his phone. Your room was a lot brighter now that the sun had risen.
“Probably.” Chan groaned as he pulled himself out of bed. “You can go back to bed,” he protested when he noticed you were also crawling out.
“I’ll make breakfast.” You smiled as the two of you emerged from your room to answer the door.
“Took you long enough,” Changbin couldn’t help but bite in annoyance when Chan finally opened the door.
"You're not Felix."
Changbin scoffed, "feeling the welcome."
“Sorry,” Chan replied as he rubbed his face. "I thought Felix was coming over today."
“He is," Changbin agreed, "he was baking enough cookies to feed an army earlier today. Hey (y/n)!” your guest waved his hand as he noticed you in the kitchen making coffee.
“Hey Changbin, how’ve you been?”
“Did the two of you just wake up?” he asked as he looked between the two of you with slight eye bags and general ‘I just woke up’ vibes.
“Yeah.” Chan replied as he searched through the apartment for his laptop.
“Rough night?” Changbin asked with a smirk.
“Seo Changbin!” you shouted and threw a pot holder at him; he easily ducked and ran away laughing.
Masterlist
AN: Oops, I did not mean to disappear for over a month. I'm sorry!! Haha but inspiration struck (as usual) in the middle of the night, so apologies if this is a little messy as it's completely unedited (I didn't want to leave you guys hanging ^^;). On another note more related to skz, NOEASY! So excited!! I'm loving Felix's big hat and Chan's been getting way too comfortable with crop tops. Are you sure you're foive sir?? xD
122 notes · View notes
changeling-rin · 3 years
Text
Okay... you have no idea what a complicated question you just asked.  Therefore, I’m gonna give this one its own post, for simplicity’s sake.
So!  My language ciphers are all English-based, because that’s my native language, and are usually constructed based on the placement of keyboard letters.  (I say usually, because there’s at least two exceptions.)  
.............................................................................................................................
That said: Hello and Welcome to a Quick Guide to the DL Languages!
Labrynnian - Use the phonetic alphabet, but backwards.  Insert an ā after letters a-e, ē for f-j, ī for k-o, ō for p-t, and ū for u-z.  In case of excessive vowels, insert a ł. There is a specific verb-possessive relationship, however, where things like 'My name is Changeling' become 'Name-my is Changeling'.
Holodese - Move one key to the right.  If at any point it seems like you'd trip over your tongue trying to pronounce the result, insert an apostrophe to give your mouth a breather.  Only one apostrophe per word since more than that makes it hard to read, for all other necessary mouth-break moments use an ᾳ as a filler vowel.  In case of excessive vowels, use an ӽ.  Contractions and possessives are indicated with a hyphen in place of the English apostrophe.
Subrosian - Move one key to the right, except in the case of vowels.  All vowels stay the same.  The apostrophe and hyphen rules from Holodese also apply here, though much less frequently - same with the filler consonant and vowel if necessary.  Subrosian is intended as a subset of Holodese, hence the similarities.
Koholish - Move one key to the left.  Insert an 'e' between any letters that do not make phonetic sense.  In the case of an 'r' transposing into an 'e', that 'e' becomes an ë in order to differentiate.  There are no contractions in this one, so words like don't become do not, except that it's actually 'donot'.  Possessives are in progress.
Divinet - Take your sentence, remove all the consonants.  You are left with the vowels.  Fill in the former consonants with either 'w' or 'y', depending on whether or not it was a hard or a soft sound. Hard Sounds: B, C, D, K, L, M, N, P, Q, T, V, X.   Soft Sounds: F, G, H, J, R, S, Z, TH, SH.  For any 'y' that is pre-existing, use a ẏ to differentiate; same for 'w' with ẇ.  Filler vowel is æ, filler consonant is ḣ.
Darkling - Take your words and rearrange them by their alphabetical pronunciation.  Apply the Koholish Rule of playing nice with phonetics, but add õ instead.  In the event of excessive vowels, insert an ñ to make them play nice.  
Archaic - Might be Ancient Hylian, haven't decided yet.  Take the alphabet and numerize it, then apply those numbers to the keyboard.  1 is Q, 2 is W, 3 is E, etc.  Then take your word, take the alphabet number of those letters, and apply those numbers to the above-established keyboard pattern.  Add û between anything that doesn't make phonetic sense (The Koholish Rule)
.............................................................................................................................
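For anyone who'd rather let a computer do the letter-swapping, here's a rough sketch of the Holodese/Subrosian "move one key to the right" rule in code. The wrap-around at the end of a keyboard row is my own assumption (the rules above don't say what happens there), and the apostrophe/filler-letter tweaks are left out, so treat it as a starting point rather than the canonical cipher:

public class KeyShiftCipher {
    // QWERTY letter rows; every letter is swapped for the key one position to its right.
    private static final String[] ROWS = { "qwertyuiop", "asdfghjkl", "zxcvbnm" };

    public static String encode(String text) {
        StringBuilder out = new StringBuilder();
        for (char c : text.toLowerCase().toCharArray()) {
            out.append(shift(c));
        }
        return out.toString();
    }

    private static char shift(char c) {
        for (String row : ROWS) {
            int i = row.indexOf(c);
            if (i >= 0) {
                // Wrap around at the end of a row (an assumption, not a rule from the guide).
                return row.charAt((i + 1) % row.length());
            }
        }
        return c; // spaces, punctuation, and anything non-letter stay as they are
    }

    public static void main(String[] args) {
        System.out.println(encode("Hello")); // prints "jraap"
    }
}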
Not all of these have been used in DL yet, for various reasons, but I’m sure you recognize the ones which have.  The only languages that I don’t have ciphers for yet are Firetongue, which thus far consists of growling noises and is understandably hard to craft for, and Jabber (the Picori language) which has an in-game cipher of sorts that I’ve yet to crack completely.  
Also, because I know that the words ‘Alphabetical Alphabet’ are confusing, here is the appropriate reference.
Aich - H, Arr - R, Ay - A, Aye - I, Bee - B, Dee - D, Djee - G, Double-you - W, Ee - E, Eff - F, El - L, Em - M, En - N, Ess - S, Ex - X, Jay - J, Kay - K, Kew - Q, Oh - O, Pee - P, See - C, Tee - T, Vee - V, Wye - Y, You - U, Zee - Z
Hope that answers your questions!
44 notes · View notes
doctorload308 · 3 years
Text
Birt Pojo Data Source
Use esProc with BIRT via an SPL script. With it, your BIRT reports can query two data sources at once, no matter what kind of databases they are, and carry out other computations that are not convenient in BIRT itself.
Using a POJO data source in BIRT 4.3: to create a report in BIRT 4.3 we can use the POJO data source, which is supported as of that release. To use it, we need to create a dataset class.
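As a rough sketch of what that dataset class pair can look like — the open/next/close method shapes below follow my understanding of the BIRT POJO data set convention, and the class, column, and sample names are invented for the example, so check the BIRT documentation for your version before relying on the details:

// Customer.java -- a plain POJO; its getters become the data set columns via the column mapping.
public class Customer {
    private final int id;
    private final String name;

    public Customer(int id, String name) {
        this.id = id;
        this.name = name;
    }

    public int getId() { return id; }
    public String getName() { return name; }
}

// CustomerDataSet.java -- the class the POJO data set points at. BIRT calls open() once,
// then next() once per row (null signals the end), then close().
import java.util.Arrays;
import java.util.Iterator;
import java.util.Map;

public class CustomerDataSet {
    private Iterator<Customer> rows;

    public void open(Object appContext, Map<String, Object> dataSetParamValues) {
        // A real application would pull these rows from the app context or a service call.
        rows = Arrays.asList(new Customer(1, "Alice"), new Customer(2, "Bob")).iterator();
    }

    public Object next() {
        return rows.hasNext() ? rows.next() : null;
    }

    public void close() {
        rows = null;
    }
}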
BIRT is an open source engine to create data visualizations that can be integrated into Java web applications. It's a top-level software project within the Eclipse Foundation and leverages contributions by IBM and Innovent Solutions. It was started and sponsored by Actuate at the end of 2004.
Data - Databases, web services, Java objects all can supply data to your BIRT report. BIRT provides JDBC, XML, Web Services, and Flat File support, as well as support for using code to get at other sources of data. BIRT's use of the Open Data Access (ODA) framework allows anyone to build new UI and runtime support for any kind of tabular data.
BIRT (146)
Build (4)
101416 Incorrect font format in BIRT Developer Guide in Help menu (closed/fixed)
103303 The chart engine.jar in the runtime distribution is the wrong file (closed/fixed)
105638 Rotated text report item displays in palette by default (closed/fixed)
106247 Eclpse Distro Path (closed/fixed)
Chart (46)
102417 Chart title is set to Chinese, can not be previewed correctly.(resolved/wontfix)
103130 Chart title is overlapped when previewed.(closed/worksforme)
103143 Data format doesn't work in Chart (closed/invalid)
103153 Scatter chart, if tick transposed, image does not be changed.(closed/fixed)
103168 X Axis data is lost when transposed.(closed/fixed)
103298 X series in a pie chart in wrong order (resolved/fixed)
103438 Scatter chart is displayed differently when it is transferred from another chart type.(closed/fixed)
103439 Steps in chart builder can't be restored to default setting when cleared out.(closed/fixed)
103453 Scale for 'datetime' type in chart builder doesn't work.(closed/fixed)
103460 Multiple x axis are not supported.(closed/fixed)
103463 Datetime marker line can't be set.(closed/worksforme)
103595 Datetime data in Chart axis of example are displayed inconsistently in layout.(closed/invalid)
103599 Resizing chart results in Eclipse hang up.(closed/fixed)
103602 Exception is thrown when setting chart height or width.(closed/worksforme)
103613 Linking chart by parameter causes error when a NULL param value is reached (resolved/fixed)
103617 if Label positioin is changed, then can not return initial state.(closed/fixed)
103618 Bar Chart , Label position is lack of inside item.(closed/fixed)
103770 don't use hyperlink (resolved/invalid)
103780 Chart is not displayed in layout view when transposed.(closed/fixed)
103782 Attached chart design file can't be previewed.(closed/fixed)
103787 Add a new Y-axis and set it's title to visible will cause chartException.(closed/fixed)
103960 If x axis type is 'Linear', scale should be grayed out.(closed/fixed)
103961 Marker and line doesn't work for X Axis.(closed/fixed)
103963 If there is no data for series, it should pop up a friendly error message to remind.(closed/fixed)
104248 Axis types on Chart dialog are not displayed in localized language.(verified/fixed)
104252 Sort option on Chart X-Series dialog is not displayed in localized language.(verified/fixed)
104254 Type and Action value on Chart Y-Series are not displayed in localized language.(verified/fixed)
104278 Values in Tick Style list box are not displayed in localized language.(verified/fixed)
104283 Value for Label Position on Chart dialog are not displayed in localized language.(verified/fixed)
104290 Hard coded strings on Chart Attributes>Y Series dialog (verified/fixed)
104313 Set the image to the chart label background, system throws exception (closed/fixed)
104315 Plot background image can not always take effort .(closed/worksforme)
104450 If plot background is set, data set binding is lost.(closed/fixed)
104465 Data values of Y-series cannot be displayed correctly (closed/invalid)
104613 Steps changed after chart is transposed.(closed/invalid)
104628 Chart Major.Minor Grid line style won't display in layout (closed/wontfix)
104631 If set a long title to chart X Axis,Axis type will be truncated (closed/fixed)
99331 Eclipse hangs when closing 'Chart Dialog' (resolved/fixed)
100746 Whole chart should display smaller on scale, not only display title and legend after resize (closed/invalid)
101039 Series colors do not have different default values (closed/fixed)
101179 Unfriendly error message display when preview chart with invalid data set (closed/fixed)
101806 Chart axis label background is not displayed properly in layout view.(closed/fixed)
101827 Exception should be processed before written to error log or some error message should pop up to warn user (closed/fixed)
101855 Chart title top.bottom inset should display right in layout view (closed/fixed)
101868 series value format can display as setting (closed/fixed)
102455 Pie Chart is not round (closed/fixed)
Data (22)
94542 Grouping versus Sorting (closed/invalid)
99479 After Update Birt 1.0 error Cannot load JDBC Driver class (resolved/fixed)
101553 Report Parameters are not working (resolved/duplicate)
101864 NullPointerException throws out when setting the parameter type as auto (closed/fixed)
101865 Try to set report parameter's value in beforeOpen method of data source,error occurred when save but preview was correct.(closed/duplicate)
103135 Change the name of one computed column which are used in dataset filter will cause the dataset filter invalid.(closed/fixed)
103151 When a data set parameter is generated automatically, error message doesn't always pop up.(closed/fixed)
103292 No error message when group key dismatches the interval (closed/fixed)
103296 Data set column doesn't work when it is entered by keyboard in data set filter page.(closed/fixed)
103346 Weekly interval groups by 7 day increments, not by week (resolved/fixed)
103625 Database URL will be refreshed when editing JAR files in Manage drivers dialog (closed/fixed)
104174 If I re-select csv file name, columns selected before in right pane disappeared.(closed/fixed)
104178 Linux:No file listed for '*.*' filter when new a flat file data source (closed/fixed)
104185 An additional column is generated when create a script data set (closed/fixed)
104204 Test connection fail when try to connect birt derby sample db.(closed/fixed)
104397 JDBC Data Set Editor showing empty.system schemas (resolved/fixed)
104452 IllegalArgumentException thrown out when double click on data set after change flatfile data source charset (closed/fixed)
104578 German labels are truncated on Manage JDBC drivers dialog.(verified/fixed)
104611 Smoke Test: Jdbcodbc data source can't be connected.(closed/fixed)
104616 A sql statement with parameter can not be changed if you go out of 'Edit DataSet'->'Query' (closed/fixed)
106250 POJO Data Source (closed/fixed)
103802 Aggregate function in a group footer using Total.OVERALL fails (resolved/fixed)
Data Access (16)
99872 Implementing the ODA Log Configuration in BIRT ODA Drivers (resolved/fixed)
100090 JDBC driver loaded either by data explorer or report viewer (resolved/fixed)
100495 'next' button is grayed out in 'new data source' window when create a data source.(closed/fixed)
100501 NPE thrown out when new a flat file data set (closed/fixed)
101185 NullPointerException thrown out after click on Finish in data set dailog (closed/fixed)
101372 Limit the data set to a particular schema for JDBC connection (resolved/fixed)
102405 Broken display names when Qry has Dup col names (resolved/fixed)
103531 Change data set type from Flatfile Data Set to Flat File Data Set (resolved/fixed)
103533 Change Flatfile Data Source to Flat File Data Source (resolved/fixed)
103544 Allow any filename extension for CSV files (resolved/fixed)
103598 Flat file - Use second line as data type indicator only works for String (resolved/invalid)
103600 Change spelling in error message (resolved/fixed)
103942 Cannot create a JDBC connection (resolved/invalid)
104306 ODA Consumer handling of a null argument for IQuery.prepare (resolved/fixed)
104630 Column icons don't show up in connecting derby database (closed/fixed)
105112 ODA Log Configuration's Optional Attributes (resolved/fixed)
Documentation (3)
101582 Smoke Test: NullPointerException is thrown out when open an existing report design file in which there is grid.(closed/invalid)
101969 Wrong reference in BIRT Developer Guide (closed/fixed)
101977 API document is inconsistent with real implementation (closed/fixed)
Report (7)
87022 Use preservelastmodified in Ant Copy scripts (closed/fixed)
92091 rom.def - allowsUserProperties set to false for Styles, and other entries (resolved/fixed)
101825 Set bold style to grid in property editor and it will be reflected in grid highlight box when you add a highlight rule but will not when you modify it.(closed/fixed)
102496 onRender of Data item isn't executed (resolved/fixed)
102725 DimensionHandle can not parse '1,2in' (resolved/fixed)
103517 Cannot load 'Driver' class (resolved/fixed)
104769 org.eclipse.birt.report.model.metadata.ExtensionException found in meta.log (resolved/fixed)
Report Designer (28)
87803 Data Explorer view doesn't show new data source or data set (resolved/fixed)
87804 Incorrect rendering in BIRT property editor (closed/fixed)
87830 NullPointerException in org.eclipse.birt.report.designer.internal.ui.editors.schematic.ReportDesigner.selectionChanged (resolved/fixed)
88935 Wrong string formatting (upper and lower) (resolved/fixed)
100354 '%datasource.name' is listed in data sources list when create a data source.(closed/fixed)
100964 Provide Support for the Eclipse 3.1 Platform Release (resolved/fixed)
100965 Create a RCP version of BIRT Report Designer (resolved/fixed)
100999 Ctrl+Z.Y doesn't work in expression builder (closed/fixed)
101000 Font is not sorted in order.(closed/fixed)
101586 Exception throw out when new a table group with invalid group field (closed/fixed)
101973 Digit number for ruler displays partially when setting bigger value (closed/fixed)
102598 i18n bug mulitplies numbers by 10 (resolved/fixed)
102713 Undo.Redo can't be refreshed right away after setting hyperlink.(closed/fixed)
102969 Paste should be disabled when nothing is copied (closed/wontfix)
102973 Table group interval shouldn't change after preview (closed/fixed)
103126 hyperlink content in property editor can't be cleared (closed/fixed)
103158 NPE throw out when click on edit group in cheat sheet when delete table group (closed/fixed)
103171 edit the dynamic text won't restore last expression to expression builder (closed/invalid)
103526 New Data Set dialog box has red square on right side (resolved/fixed)
103788 Display inconsistantly in BIRT GUI (closed/fixed)
103962 RCP:Project icon can not displayed (closed/wontfix)
104184 The button in Dataset.Filters can not work (closed/fixed)
104307 when group on a double type field, set interval less than zero should be permitted (closed/fixed)
104617 In chinese testing environment, translation need to be improved.(closed/fixed)
104623 Highlight preview doesn't work when change two highlight rules order.(closed/fixed)
104764 Acceptance Test: New Report repeatly produces same name of file (closed/fixed)
101403 Smoke Test: Property editor view doesn't work.(closed/fixed)
101407 NullPointerException when selecting Save As in top menu (closed/fixed)
Report Engine (14)
96357 Projects contain errors when opened in Eclipse (resolved/worksforme)
101361 Bugs in org.eclipse.birt.report.engine.extension.internal.ExtensionManager (resolved/fixed)
101685 Unable to use the Report Item Extension framework, when no query exists (resolved/fixed)
101751 Enhance IImagehandler interface to allow full customization of birt image handling mechanism (resolved/fixed)
103050 PDF Hyperlinks (resolved/fixed)
103106 Generates incompatible FOP files (resolved/fixed)
103120 Hyperlink file can't be retrived when click it in PDF page (closed/invalid)
103169 Format number with Symbol prefix should display right size when preview in Linux (closed/wontfix)
103449 Log BIRT extension loading details information (resolved/fixed)
103622 Inline for two grids doesn't work in layout view and pdf preview.(closed/duplicate)
104172 Blank pages will be generated when set Page Break to always.left.right.(closed/invalid)
104239 Value-Of Problem (resolved/fixed)
104457 Set table drop to all, preview does not take effect.(closed/worksforme)
104629 Generating report in custom plugin cause exception fop.xml malformed URL (resolved/fixed)
Report Viewer (5)
100596 DateTime parameters not functioning as report parameters (resolved/invalid)
104177 Spaces in parameter value which is entered are trimmed when previewed in html.(closed/wontfix)
104462 There is a parameter in a parameter group, 'show report parameters' button is always grayed out when previewed.(closed/fixed)
104759 Image imported from a related path in file system can't be previewed.(closed/invalid)
104962 Smoke Test: Data can't be displayed when previewed if data source type is 'sample datasource' or 'JDBC Data Source' except 'JDBCODBC driver'.(closed/fixed)
Test Suite (1)
100968 Provide Daily Build Test Reports on eclipse.org.birt Web Site (closed/fixed)
In a previous blog post I created a skeleton class for rendering a report using BIRT runtime. You can pass it the report parameters, the report definition (rptdesign) and an OutputStream and it will render HTML to that stream.
If your report definition contains graphs we run into a problem. Normally, in HTML an image is a separate resource. BIRT will generate the images containing your graphs in a temporary directory and will link to them in your HTML. For this to work, you will have to configure the Platform to write the images to a publicly accessible directory and write the links using the correct domains. Furthermore, you’ll probably need some process to clean up the images after the reports have been viewed. If your reports are being used in some website and generated on the fly, this is most likely quite difficult to determine. Maybe when the user logs out?
Luckily, in modern browsers we can embed the images in the same stream, bypassing the need for a temporary directory. The following trick encodes the image with base64 and embeds it directly into the HTML stream as a data URI. This works in most modern browsers, but of course Internet Explorer lags a bit behind: PNG support is available only up to 32kb in Internet Explorer 8, and SVG is not supported at all. Internet Explorer 9 works fine, as do the other major browsers.
So how does it work? First, we explicitly tell the render engine to use PNG or SVG. SVG provides sharper images but will not work in Internet Explorer 8 as mentioned above. Next, we inject our own HTMLServerImageHandler which encodes the image data to base64.
private HTMLRenderOption getHtmlRenderOptions(OutputStream outs) {
    HTMLRenderOption options = new HTMLRenderOption();
    options.setSupportedImageFormats("SVG");
    options.setSupportedImageFormats("PNG");
    setupImageHandler(options);
    options.setOutputFormat("html");
    return options; // return added so the method compiles; the original snippet stopped here
}

private void setupImageHandler(final HTMLRenderOption options) {
    options.setImageHandler(new HTMLServerImageHandler() {
        @Override
        protected String handleImage(IImage image, Object context, String prefix, boolean needMap) {
            // Base64 here is whatever encoder utility was on the original project's classpath.
            String embeddedImage = Base64.encode(image.getImageData(), false);
            // Return a data URI instead of a link to a temporary file on disk.
            return "data:" + image.getMimeType() + ";base64," + embeddedImage;
        }
    });
}
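For context, here is roughly how those options might be wired into the engine — a minimal sketch in the spirit of the skeleton class mentioned at the top, with illustrative class and method names rather than the author's actual code:

import java.io.OutputStream;

import org.eclipse.birt.report.engine.api.EngineException;
import org.eclipse.birt.report.engine.api.HTMLRenderOption;
import org.eclipse.birt.report.engine.api.IReportEngine;
import org.eclipse.birt.report.engine.api.IReportRunnable;
import org.eclipse.birt.report.engine.api.IRunAndRenderTask;

public class EmbeddedImageRenderer {

    // Render a report design straight to the given stream; 'options' is the object built by
    // getHtmlRenderOptions(...) above, so the chart images end up embedded in the HTML itself.
    public void render(IReportEngine engine, String designPath,
                       HTMLRenderOption options, OutputStream outs) throws EngineException {
        IReportRunnable design = engine.openReportDesign(designPath);
        IRunAndRenderTask task = engine.createRunAndRenderTask(design);
        try {
            options.setOutputStream(outs); // the HTML, embedded images and all, goes to this stream
            task.setRenderOption(options);
            task.run();
        } finally {
            task.close();
        }
    }
}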
1 note · View note
victorlimadelta · 4 years
Text
Pidge is actually trying to take this a tiny bit seriously. Last night, while she was working on moving her work station into the makeshift pharmaceutical laboratory she’s set up for herself over the last few months, she was putting together a presentation, like it’s a business pitch or a grant funding exercise. Still, it’s easier to illustrate her point when she has diagrams to go off of. The fancy little holograms from her PADD can even be manipulated in real-time in three-dimensional space, for added cool factor.
It also means she can keep her thoughts together as she goes through the theoretical aspects of this with @swordsedge Ulaz. Before she begins, she takes a shot glass, fills it with the Olkari root extract she’s come to love so much, and knocks it back like it’s so much liquor. That should keep her going for the next eight or so hours and stave off the fluorescent-inspired headache she’s guaranteed to get if she works down here too long. She offers some to the Galra in front of her, but he declines. Reasonable. He doesn’t know what it is, and she could have tampered with it, so she’s not offended.
They’d had a brief conversation last night, as well, about how to structure this upcoming week. Pidge had asked Ulaz what the Galra Empire would do for someone who had a genetic degenerative disease. The answer, unsurprisingly, was a mercy cull. For an empire driven by expansion at all costs, a disabled life is not one that can be afforded. Ulaz did show the correct amount of disgust as he explained, at least, which reassured Pidge that he was here for the right reasons, to do the right thing. What wasn’t so reassuring was that he hadn’t actually encountered this specific problem before, as a medical officer.
Tilting her PADD against her empty glass so the holograms can project onto the table, Pidge launches into her explanation. “so, you understand what we need to do here,” Pidge reminds Ulaz. “this is different from just keeping shiro in stasis and keeping disease from progressing. this is total genomic overhaul.” She flicks the first diagram out from her screen to the table, starts spinning it--a puffy little X shape made of squiggles. “what we’re working with is the x chromosome, a location on the short arm called p21.2-p21.1.” When she zooms in with her fingers, there's a noticeable length difference between the two top arms of the chromosome. “there’s a deletion here--not one of the worst, but not in a good place, either. this codes for dystrophin: the protein that builds human muscle. without it, the muscles we’re born with can’t be effectively re-built when they’re damaged. usually, you’d have a backup on the other matching pair in your chromosome set, so your body could just use the one that works and ignore the one that doesn’t, but shiro can’t do that, because he doesn’t have a second x chromosome, he has a y chromosome. which, don’t tell it i said this, but it’s pretty useless, aside from sry. poor little thing. smallest in the human genome.”
This is probably stuff Ulaz already knows. Based on what Pidge surmises about Galra, just from pure conjecture surrounding the fact of Keith’s existence, they also must have a similarly-based biology, with double-helix DNA, ACGT pairs, X/Y sex chromosomes, even the same number and arrangement of chromosomes. Otherwise, Galra wouldn’t be able to reproduce with humans, or proliferate so far with so many other alien races. Still, it helps to start from the common denominator and build up to more complex premises.
Pidge pinches her fingers together, then spreads them to zoom in on her DNA diagram--to the portion that’s missing. “there’s maintaining the dystrophin shiro still has, and there’s teaching his body how to make it for himself. two different things. he already had weakness in his legs, to be expected, but now you’re telling me he’s having trouble breathing. that means his diaphragm can’t repair itself. he’s too weak to work his own lungs. that’s... that’s advanced. the only way it could be worse is if it was in his heart, and we don't know that it's not. so, we can’t just plug this with pharmaceutical intervention. giving him the actual dystrophin protein isn’t, by itself, going to get him where he needs to go. he needs to do it for himself, and he needs to be able to rebuild what’s been lost on top of it. that means...”
Another diagram flicks next to the first. This one's the clip of what's missing. “i have to get this, here, but... everywhere. as far as you're telling me, this is something the galra weren’t even interested in devoting resources to. it’s something humans haven’t quite been able to achieve, even with crispr, our most advanced gene splicing engineering technology.  altean alchemy isn’t suited to this, and i can’t see that they've ever attempted a genetic cure, just an amino acid replacement. the olkari seem to find it anathema to attempt it, even with their advanced biohacking abilities. but i’m--we’re not dealing with just one set of medicine. we’re not limited here. i can use all of this accumulated knowledge and make something bigger than the sum of its parts. i just need to run this by you, theoretically speaking, to see if it’s even possible in practice.”
Dismissing the first diagram to focus on the second, she twists her two hands, pulls them apart, and it zooms in on the individual molecules making up the DNA helix: red adenine paired with green thymine, yellow cytosine paired with blue guanine, clumped in threes (that’s a slight liberty with the illustration, but it works for these purposes). “coran’s taught me how to use this lab to make pharmaceutical compounds i thought would be impossible with the materials we have. apparently all you have to do is ask these atoms and molecules nicely to create their bonds. so far i’ve been... moderately successful in using it.” That’s false modesty. Pidge has been able to synthesize a full medication line for Shiro by now, from advanced corticosteroids to muscle relaxers, from gene-targeted therapies to painkillers. “but, i mean, dna is just a bunch of molecules, when you get down to it. huge, snarled-together molecules, but molecules all the same. the backbone of the helix is the same. the a, c, g, t are the same. if i can teach the lab to make the individual components, it’s just an issue of putting the building blocks in the right order and making them stick together. that part, actually making the gene i need, that’s the part i have the most confidence in. i know i can do it. what i don’t know is how much time it’s going to take, or if i can accelerate it by redirecting non-essential ship power to this one resource. and i won’t know for sure until i get started on it. but, the good news is, i know what i need to make and how i need to make it. easy.” Relatively speaking, of course.
The next image Pidge pulls up is entirely new. “this--this part’s more complicated. this little device is crispr. technically it’s a repeating genome sequence that humans synthesized from a bacteria, but you can use it for genome modification. depending on what kind of rna you attach to it, you can use it to snip out genes entirely, or cut and paste from one mis-transposed location to another. notice i didn’t say insert. it needs to get the material from somewhere to insert it in the first place, and creating the right sequence out of nothing was always a little too difficult to stabilize in human trials. plus, there were ethical concerns with using it on stem cell lines. no such worries here. if i use altean alchemy to create the missing piece, and if i use the right rna to point it at xp21.2 through .1, it should plunk it right into place. and there’s no medico-ethical dilemma present for doing this with a full-grown person, like there would be if we were trying to fix it in a zygote. it doesn’t even generate the should-we argument. now, getting the rna to target the right location, and getting the delivery mechanism to be stable, and getting it to lock into place, that’ll be a little more difficult.”
What flashes into the set of images Pidge is using, this time, is a series of ones and zeroes. “that's where the olkari technology comes in. their tiaras use human brainwaves, sent as binary code, to modify messenger rna, to redirect plants on what genes they should be expressing at any given time. it unlocks a gene’s potential. this should be the key to not only targeting the right location for the gene insertion, but also in making sure that it’s getting used correctly to code for dystrophin. the question you’re probably about to ask is, how does this work with dna when dna isn’t written in binary? but it’s not about reading it, it’s about finding it. rna will read it for itself, pull the correct amino acids, and make dystrophin. cells are pretty smart that way.”
Dismissing all those prior symbols, Pidge finally pulls up a diagram of the human body. “so, congratulations. using a series of increasingly unstable chemical reactions pulling from the most advanced medicine, science, engineering, and coding from three different starfaring species, we created, spliced in, and activated exactly one copy of the dystrophin-coding gene, into one cell.” The hologram zooms in to some generic muscle strand of the forearm. “that cell could die before undergoing mitosis. even if it survives, that’s no guarantee that the new, fixed genome will propagate very far, even within the same physical location of the body.” A red flash, indicating failure.
“but, if i’m understanding your research correctly, there’s something you can do with filtered quintessence to not just make it stick around, but to get it to actually change the whole body genome. this is the part that i’m the most skeptical about,” just in case Ulaz couldn't tell from her tone. “i don’t know how quintessence works at the best of times. as far as i care, though, if it does what you say it will, then it can be literal space magic--as long as it works by a set of fixed principles. if you’re saying we can wash out the old genome and, i guess, dye the new one into place by steeping shiro in enough quintessence, it’s worth a try.”
Presentation over. Pidge collapses her diagrams, puts her PADD face-down on the table. When she catches Ulaz’s face, his expression is unreadable. Just like always, really. “so, after all that, i have two questions for you. one, does that sound like something we can, theoretically, even do? i don’t want to waste time or energy on research if it’s not going to pan out in real life. and even if it does, question number two, how much quintessence would it actually take to do something like that? are we talking on the level of a d-cell battery, car battery, aircraft engine, starship-class balmera crystal, the type of energy it would take to hold strand in stasis for eons--what do we need, and can we actually get it?”
7 notes · View notes
sorcierarchy · 5 years
Text
Tarot Tips for New Readers
(I wrote this last year in response to an Ask, figured I would repost it since it comes up often!)
First of all (and this might be late if you’ve already purchased a deck, but for anyone reading this who is looking for their first deck), get a deck that is fairly detailed. There are some incredibly beautiful minimalist decks, but personally I find a detailed and figurative (not abstract) deck is best to start with. Even better if you have people with different faces, expressions, positions, etc.
Tip 1: Don’t memorize the cards, and don’t read the book. Obviously you can read the book at some point, but my recommendation is not to start there. Infer your own meanings from the cards first, and then complete them with the book as needed. I’ll get into more detail on this further down.
Tip 2: Keep a tarot journal. Or use your grimoire to keep notes if you don’t mind having them all in the same spot, but definitely you want some kind of reference for yourself. This is about keeping your initial impressions on the cards, but also how your impressions evolve over the course of your time with the deck. Even beyond reading the book and making your original inferences, I can guarantee your ideas on what means what will change over time (and that’s not a bad thing). Tarot is like learning a language and as your understanding of the language evolves, your original notes of “x means y” will change and become more nuanced.
Tip 3: Learn the characteristics first, and piece them together to form the meanings of individual cards. Think of the language example again. In the English language, about 300 words make up roughly 65% of all written works. There are over 170 000 words currently used (not counting multiple meanings, different dialects, etc), so that’s less than 1% of words needed to understand most of written English text. This principle can be transposed, to an extent, to most languages… and that includes tarot. My point is: it’s not about learning all the words right off the bat, but about learning the right ones. This is continued in tips 4 through 8.
Tip 4: Learn the Major Arcana last. These cards represent moments of clarity or change. They are normally big events, decisions, etc, and mark a very clear break between “before” and “after”. I know this is the exact opposite of what most people recommend when it comes to tarot, but personally I find the Minor Arcana cards to be a lot more useful for day to day stuff, and the best way to learn tarot is to use it often for small questions that don’t necessarily require Major Arcana cards. If you start with only the Major Arcana cards and ask “should I do my dishes today” and get the Death card it’s kinda like…. super fucking dramatic, you know? I enjoy being as extra as the next guy, but sometimes it’s like powering off your PC with a sledgehammer... there’s a better way to do it.
Tip 5: The suits. Learn the suits as categories by assigning meaning to them on a personal level. If you’re working with a more traditional deck with swords, cups, pentacles, and wands: what do each of these elements mean to you? Go beyond the obvious and get into the details, the backstory, whatever comes to mind. Write down the key words and brainstorm on each of them. For example, when I see swords I think of something more cerebral and intellectual, a decision or action that requires aim and consideration or planning. I also think of the people who tend to hold swords and how they use them (can be used defensively or as an attack). I think about knights, and the sense of duty that a sword can represent. Et cetera for each suit.
Tip 6: The numbers. You want to go over the same as the above, but with each number. What does the number two represent for you? What ideas does it conjure? What are the up sides and down sides of it? Another example, the number four reminds me of stability, like a chair with four legs, but it also reminds me of being restrained or walled in. Stability can be good, but can also lead to feeling trapped... and so on. 
Tip 7: Court cards. Court cards are normally representative of a person, personality type or archetype, or a cluster of characteristics (physical, emotional, or psychological). Personally, I don’t take these cards literally as far as gender, hair colour, etc go (but that’s up to you). For example, a knight with a “male” looking figure doesn’t necessarily mean an adult man. Take hints from the other elements in the card to discern what type of person it could be depicting, and write down your impressions. Again, try to think of the up sides and down sides of this personality type.
Tip 8: Describe the card and put it all together. And I really do mean “describe the card” on a literal level. What is happening in the card? What are the characters doing, if there are any? What’s the story? Think about how the visual action goes with the characteristics you’ve gained from the suit and number in the card. If there are any other symbols or colours that have personal meaning to you, take those into account as well (for example, if the colour blue is dominant and you associate it with calm).
Tip 9: Context clues. Again with the language example, but use your context clues. What position is the card in? What was the question asked? How does this card relate to the other cards around it? Pulling together from our previous examples: I have the four of swords in the “what I should do” position of a spread, and the question is regarding finances. The card depicts a person with their head resting on the hilt of two swords, asleep and looking peaceful. The colour palette is predominantly blue. I might interpret this card as saying to rest on my intellect in regards to this matter. If figures on separate cards are looking forward, backward, at another card... these should all be taken into account. The way I describe it when I’m giving coaching on tarot reading is to see the cards like panels in a comic strip. They individually have meaning, but they need to be read in context of each other. 
Tip 10: The infamous reversed card. In movies inverted cards always seem to be accompanied by a loud gasp and wide eyes, but this is also commonly called the “advice” position. Meaning, this card is not so much a prediction as it is a suggestion on what you should be doing or improving, or a warning. Use your context clues to determine which of these is most applicable.
Practice! Practice practice practice!! This is probably the most obvious thing in the world, but I can’t say it enough. You can spend all day staring at lists of vocabulary and memorizing, but nothing will help you learn faster than going out into the world and using it. Use tarot for ridiculous little things, and big things, and just whenever! Take notes on how you interpreted the card at the time, and then go back to your notes and see where you were right and where you had misread. You’ll be absolutely amazed how often the answer was in the cards and you just didn’t see it. I highly recommend picking a card in the morning, writing down your interpretation, and then comparing it to how your day actually went. See where your interpretation was correct, and where you need to adjust.
As a final note if you would like to learn more about tarot I HIGHLY recommend checking out biddytarot.com
Honestly, there is nothing I could possibly say that Bridget hasn’t already said (and likely a lot more eloquently). She has an amazing set of articles, a fantastic podcast (and a lovely voice), and on top of all that there is a forum where you can do readings for people and get feedback from them. She even has tips and suggestions on how to read tarot professionally. I’m not affiliated with her in any way, but if I had heard her podcast when I first started reading tarot I would have learned it about five years faster than I did.
As always, my inbox is open if anyone has questions about tarot! 
1K notes · View notes
Text
“You’re going to make this awkward, aren’t you?” - Roger x fem Reader (smut)
Summary: It’s 1983 and Queen are recording their next album at the studio where you work. Things get a little heated between you and their drummer.
Part 1 | Part 2 | Part 3
In this “episode”: Things start to get a bit more serious than Reader would like during a night alone with Roger at the studio.
Word count: ~4.8k
Warning: age gap (Reader is 21, he’s 35), language, and smut, so 18+ please
Tagging: @fixedonroger @a19103 @ginabaker1666 @thickthighsandbasicbrowneyes @culturefiendtrashqueen @imaginesandideas
(Let me know if you want to be tagged!)
[A/N: Thank you for voting! This concept was thanks to 17384 anon suggestions. 🥁 And you’ll even get a little math lesson with Brian in this one.]
One month and 3 days, you think to yourself as you sit at your desk and look through the window into the studio, watching Roger sitting at the soundboard with headphones on, deeply concentrating on whatever it is he’s listening to. Brian’s sitting next to him scribbling something and John’s fiddling around with his bass. You’re not paying them any attention. Your eyes are fixed on Roger, and you can’t turn them away, no matter how hard you try. Stop it, you fuss at yourself. Turn away right now and do the stupid bookwork before Kevin comes back tomorrow and fires you because you’re too busy daydreaming. You force yourself to turn around and focus on the task in front of you. Three months of bank statements he couldn’t be bothered to balance out, and he just threw it at you, expecting you to fix everything. “This isn’t what I signed up for,” you mumble, not realizing you had company.
“Well you didn’t sign up for that either, but you sure are doing it,” Jack laughs, making you glare up at him and seeing him point at the window. “You’ve got yourself a big distraction now, don’t you?” You keep glaring at him, wishing he would just shut up. “Usually they’re the ones distracted by you,” he laughs as he sits on your desk. “Remember when Kevin gave you those two weeks off when…”
“Shut up, Jack,” you say, completely frustrated and not wanting to recount the time that fucking creep from that band you don’t even want to think about didn’t know how to keep his hands or his filthy thoughts to himself. “I have a ton of work to do and you’re not helping.”
“Cheer up, kid,” he says as he stands up and pats you on the shoulder. “You’ll get it done.” He walks back into the studio and taps on the window to get your attention. When you look back, he’s got a big, cheesy grin on his face as he waves and closes the blinds, making sure you don’t get distracted again.
It didn’t help. Not in the slightest. Roger was in your head and he wouldn’t leave. You keep reflecting on everything that happened the past few weeks. Of course, sex happened – but it wasn’t just that anymore. You were enjoying each other’s company without the sex. He’d come over and you’d watch a movie, and you’d fall asleep on the sofa, and he’d cover you with the blanket and let you sleep, sometimes even staying there until he fell asleep too. Or you’d go to the Santa Monica Pier, just to get out and have a fun time somewhere he was able to blend. One night you stayed out on your balcony until almost 3 in the morning doing nothing more than sharing silly childhood stories. It didn’t complicate anything but your feelings, and that wasn’t supposed to happen. Feelings weren’t ever supposed to be involved, but it was hard for them not to be now. Not now that he knew your birthday, and you knew each other’s favorite colors, and you knew about things he did to cause trouble when he was 13, and he knew your sister’s name… Things casual hookups don’t know about each other. He’s picked up on your habit of pushing your hair behind your ear when you’re deep in thought, just like you picked up on his nervous habit of rubbing his hand on his shoulder to calm himself down.
You’re almost finished with half of the book balancing finally, an hour after you started, when Freddie and that creep manager of his who was like his shadow walked in. “Got tired of being gawked at all the time?” he giggled when he saw the window was closed. “Or did you get tired of doing the gawking?”
You purse your lips into a grin, trying not to giggle back, but you end up giggling anyway. “Hi, Freddie. Glad to see you could make it.”
“Not you too!” he dramatically cries out. “I don’t need another person nagging me.” You smirk and turn your attention back to your work as he walks into the studio.
“Nice to see you, Fred,” you can hear Roger yell out while the door was open. “Hey, who closed the window?” Took him long enough to notice, you grunted to yourself as you rolled your eyes. “That’s blocking my view…” you hear him say as the door closes, which gives you a little grin.
Two hours. For two hours you’ve been working on this stupid bank balance and you don’t think your brain can handle anymore. How it could get so messed up you don’t know, but you couldn’t look at another number right now or you were going to lose it. You slam your head down on the stack of paper that’s laying in front of you, wanting to cry, when the studio door opens and everyone starts walking out, laughing and talking without a care in the world. You don’t even want to look up at them. You’re enjoying your momentary zone out.
“We’re going grab lunch, kiddo,” Jack yells at you. “Wanna come?”
You look up, completely drained. “It’s only lunch time?” you ask, almost whining.
“It’s almost 3:00,” he tells you. “Want me to bring you something?”
You shake your head no and start to look back at the numbers. “I need to figure this out. I can’t get the damn thing right. I hate math.”
Brian calmly walks over and peeks at what you’re doing. You look up at him, confused, and he points at the total you have and the one you should have. “You transposed numbers somewhere when you were adding,” he tells you with a smile. “Add the numbers in this together.” He pushes the calculator toward you and before you finish he says “they’re all going to add up to 9.” He squats down to get even with you and talks you through it. “Add the 7 and 2 in that 72 together, and you get… 9.” You glare over at him and he’s smiling. “Now add this all up again. Carefully, this time,” he chuckles. And what do you know – it balanced. “Just a trick I learned some time ago,” he smiled as he stood up. “Now you can come with us.”
Roger’s standing in front of you with his arms crossed, shaking his head in amusement. “Well aren’t you the fucking hero,” he sarcastically quips at Brian before looking at you. “Come,” he tells you, holding out his hand. “I insist.”
Everyone decided on the greasy diner across the street from the studio and crowded into a booth, leaving Jack to sit in a chair on the end. It was an uncomfortable arrangement, but you didn’t complain too much, since this was the most contact you had with Roger all day. He sat with his arm behind you on the seat, not even realizing that he kept rubbing your shoulder. He was watching you closely as you ate the strawberry pancakes you cheekily ordered (and yes, he found it incredibly funny), and when you were leaning over the table to be able to hear John better when he was talking to you, showing you the lyrics he had jotted down for his song they were going to be working on tomorrow.
“Do you think they flow?” he asked you. “I think this sounds stupid but I don’t know. It may work.”
You read the part he was pointing to and smiled. “I think it’s perfect. Simple, to the point, and perfect.”
The discussion you were having with John went on for a little while, just the two of you, until Roger apparently got tired of you ignoring him and cleared his throat while tapping you on your shoulder to get your attention. “You’ve a key, right? For the studio?”
“Yeah?” you say, questioning why he wants to know.
He starts to nudge you out of the booth. “Let’s go. Need to get something out of my head.” You hurry and jump up and the two of you head back across the street.
When you open the door, he hurries and locks it back and keeps the shades closed on the windows, grabbing your arm and pulling you back to him. “Hey,” he says with a grin.
“Hey, you,” you grin back. “What’s in your head that you have to get out?”
“My ex,” he started. “She’s coming here. To Los Angeles. For a few days.” You look at him, wondering why he’s so fidgety when he’s telling you this, and why he’s even telling you this in the first place. “I told her she can stay at mine…” Now he’s looking at you wondering why you don’t seem to be bothered by any of this, and his brows start to furrow.
You start to giggle. “Were you expecting me to start screaming at you? Become some unhinged lunatic or something?”
“A little bit, yeah,” he smiled. “But you’re too relaxed and I don’t know how to handle it.”
You pull him down by his shirt to get close to you and whisper deeply in his ear. “Does she taste as good as me?”
He leans back and looks you in the eye with an intensity that sends shockwaves through your entire body. “No one tastes as good as you.”
“Then I have nothing to worry about,” you tell him with the same intensity he’s giving you right now, as you reach down to cup his crotch. “And you said this was all mine.”
“It is,” he whispered in your ear as he chuckles deeply and moves his hand to your chest. “You have no idea what I’m going to do to you later.” He started to say more, but he’s interrupted by a knock on the door. You can hear everyone outside talking. You roll your eyes and start to walk away so you can open the door, but he grabs you and holds you back. “You’re all I can think about, Y/N.” You gaze up to him, your mouth slightly open with shock. There’s another knock on the door but you’re ignoring it. “You have a hold over me...” He cups your face in his hands. “I can’t…”
He’s interrupted by a louder knock on the door, followed by Jack yelling. “Open the goddamn door, Y/N! Why is it locked?”
You slide away to open the door and everyone pours in and heads directly to the recording room. “Later,” Roger comes and whispers in your ear before joining them, “we’ll finish this conversation later.”
You were internally a jumbled mess for the rest of the day. Every nerve you had seemed to puddle right there in the pit of your stomach. We’ll finish this conversation later? Your brain wasn’t helping calm you down at all. None of this was ever supposed to get to where you’d have conversations later. This was just supposed to be sex and nothing more. End it. End it, you kept telling yourself. He doesn’t mean anything he says, you try to convince yourself. He just wants to make sure I don’t… Your thoughts are interrupted by a knock on the recording room window.
Jack waves you over and you quietly walk in. “They’re working late and I can’t stay. Kid’s got some school thing. Can you stay to lock up?” You really didn’t want to. You’ve had such an exhausting day already, but you agree to stay.
You ended up falling asleep around 8:00, having been bored for the past hour and giving up all hope for a quiet evening at home in front of the television with a bowl of ice cream. Everyone was frustrated – them in the studio and you out in the office. Their frustration is why you stayed out of there. You weren’t in the mood to listen to any bickering. Freddie woke you up when they were leaving – well, all of them were leaving except for Roger, who was still sitting behind his drum kit trying to work out a beat. “He needs your help with the playback,” John told you. “Think you can handle that?”
“All I have to do is press buttons,” you grinned. “I think I can do that.” When they left, you quietly walked in the studio and sat down at the controls.
He didn’t notice you. He was focused on his drums, so you didn’t disturb him. You could see the annoyance in his face until he looked up and saw you sitting there. “Hey there,” he said through the mic. You smiled and gave a small wave. “Play that tape and listen to this…” And you did. Ten times, ten different cadences, and you recorded them all for him. That was about the extent of your abilities – pressing three buttons. “Come in here,” he said with a sigh.
When you walk in he turns on his stool and holds his arms out, beckoning you for a hug. You smile and go to him, and he wraps his arms around your hips and rests his head against your stomach. “Rough day?” you joke as you run your fingers through his hair. Well this is sweet, you tell yourself. You’re supposed to be ending it, dumbass.
“So frustrating,” he mumbles. “Nothing went right after we stopped for lunch.” He closed his eyes and started chuckling deep in his throat. “That feels nice.”
You lean down and kiss him on top of his head. “Just relax,” you mumble. What are you doing, Y/N? you fuss at yourself. He squeezes you tighter, and the two of you stay just like this for a couple of minutes before he looks up at you and smiles and puckers his lips, which you gladly lean down and give a quick peck to. You’re such an idiot, Y/N.
He lets go of his hold and pats his lap. “Sit,” he tells you, and you do, and holds your waist as he spins his stool around and grabs two drumsticks. “Ever played before?”
“Oh yeah, all the time,” you say sarcastically. “In fact, I’m so good that when Keith Moon died, The Who wanted me to be his replacement.” He poked at you with one of the sticks and started laughing. “Never even held a drumstick before.”
He hands them to you and you take them. “Well that’s going to change,” he tells you as he adjusts them in your hands. He rests his chin on your shoulder and you start to giggle. “You’re going to make this awkward, aren’t you?”
You can’t stop giggling as he holds your hands and starts moving them to hit the drum heads while he glides them around. You aren’t paying much attention. His foot stomping on the bass drum pedal is, well… creating a vibration. “I’m sorry,” you tell him, still giggling. “I’m not a very good student. My teacher is quite a distraction.”
He stops, and he’s giggling now too. “Well it’s hard to be a good teacher when my student is also a distraction.” He rubs his hands up your arms before wrapping himself around you and squeezing you gently into him. He clears his throat, his chin still resting on your shoulder. “I need you to let me all the way in, Y/N.” You turned your head quickly and looked at him, totally confused. “What are you feeling? Right now. Right this second. What are you feeling?”
“Nervous,” you whisper and you both start to giggle.
“That’s not what I mean, silly,” he says as he pokes you jokingly. “I mean, what are you feeling about… us?” You turn your head to him quickly again, looking confused again, and your mouth opens but no words are coming out. “I feel like there’s more to this.” Still, no words can come out of your mouth. “I feel like you’re scared…” You hold a hand up to his mouth, hoping he’ll stop, but instead he moves your hand away. “It’s okay,” he whispers, “because I’m scared too.” For the first time in your life, you’re completely speechless. “But I’m not willing to let that stop me.”
You look up to the ceiling, understanding everything he’s trying to tell you right now, trying to find words. You seem to have forgotten all words. “Roger,” you sigh. “Please…”
“Don’t tell me not to feel, Y/N,” he whispers. “I can’t do that.” He holds two fingers under your chin and turns you to face him. “Look at me,” he whispers, and you do. And you completely surrender. “I know you feel it. You can’t tell me you don’t.” Your heart is beating so fast you’re scared it’s going to burst. Your breathing is getting heavier and you’re still silent, wanting to tell him so bad how you feel – how you don’t want to feel – but you can’t. You can’t say anything. You’re lost in his gaze, his eyes controlling everything right now. He nudges your hips, implying he wants you to stand up, so you do, and turn to face him, still holding the drumsticks in your hands. He glances up at you and gives you that damn smile that makes you turn into putty.
He unbuttons your shirt, not taking it completely off, leaving it open. He starts to softly kiss you on your stomach as he runs his hands over your hips and thighs, undoing your pants and slowly pulling them down. “These are new,” he tells you with a smirk, running his fingers over your panties, before reaching to grab one of the drumsticks you’re still holding from your hand, tossing it back and hitting something rather loudly. He stops caressing you and takes the other one from you, and he starts to toss it back as well, but he looks at you with somewhat of a devilish grin. He starts to rub the tip of it along your body, across your stomach, down to your pelvis, then your upper thigh. He stands up as he continues to brush your skin, bringing it to the front of your covered mound, giving you another smirk as he grazes it over you right there. “What are you doing?” you ask him with a nervous giggle.
“Playing,” he whispers with a smile as he leans in for a kiss, giggling with you. He moves the stick away, tossing it behind him like he did the other one and laughing. “Won’t be needing that,” he quips. “I can take it from here.” This isn’t lust he’s exuding right now – this is passion. Absolute passion that you’ve never experienced before. Not with him or anybody else. And you loved it, and you couldn’t stop yourself from giving him the same.
You start to unbutton his shirt – that same shirt he was wearing your first time together, and it brought back a wave of memories that washed over you. He didn’t interfere with your undressing him, opting instead to stand there and smile at you, letting you slowly move his shirt off of his shoulders before running your hands down his chest to his waist. You were slipping off your shoes and stepping out of your pants as you unbuckled his belt. He brings a hand under your chin and you look up at him, and for the first time in all of the times you were with him, you felt the need to take it slow, to drink in every single second. He was feeling it too. If he wasn’t, you’d have already been completely naked and halfway to climaxing.
He slipped out of his shoes as you finished unzipping his pants and pushed them down off of his waist, moving yourself down as you glided them completely off. It was quiet – almost eerily quiet. The only sounds were your breathing. You reach up and pull down his boxers, your mouth perfectly even with his cock. You say nothing. He says nothing. You bring your tongue to the tip of his member, slowly and softly licking him clean of the precum that has graced the head, before gliding down underneath the shaft as you hold it up. You feel him start to twitch, letting you know that you were doing everything exactly how you needed to be. When you lower your mouth to completely engulf him, he lets out a small moan and puts a hand on your head, not to guide you, but because he needed to touch you. He let you set the pace, giving up all control, something he had never done before. You start to hum quietly, sending gentle vibrations that he felt though his entire body. He didn’t want to finish – not yet. Not until he knew you were completely satisfied. But fuck, the magic you were creating with your mouth and your tongue felt so good, and you were enjoying it, too. Every twitch, every soft moan, every deep breath you heard him take was getting you more and more aroused.
He takes a small step backwards, and when you look up at him, he’s softly smirking and wiggling a finger for you to stand up and meet him. You slowly stand, and when you do, he gently holds your face in his palms, drawing you in for a soft but deep kiss. There are still no words being spoken. No words need to be spoken. Your eyes and actions are doing all of the talking right now. He finishes taking off your shirt and reaches behind, unclasping your bra, guiding it slowly off of you, just as you had done with his shirt. He bends down and takes your nipple in his mouth, his other hand gently rubbing the other, as he flicks his tongue. Your head falls back, and this time you’re the one with the quiet moans. He raises up again to meet you, and gently guides you back, leaning you on his drum kit. He kisses you deeply again before slowly kissing his way down your neck, that valley right between your breasts, down your stomach and to right above your panties that he admired earlier. He runs his fingers under the waistband and pulls them down, his breathing getting heavier as he revels at the perfection he sees before him.
You feel his breath against your thigh before he lowers his mouth to continue kissing his way to his ultimate destination. With every touch his lips make on you, you flutter. Still, no words are spoken, only actions. When his mouth finally makes its way to your lips, you inhale deeply and your body jerks, causing one of the cymbals to crash down. But you don’t care, and neither does he. His concentration on your pleasure is undisturbed. The warmth of his breath on you, the gentleness of his mouth and his tongue working its magic on you causes your breathing to get deeper, quicker. You run a hand through his hair, massaging his scalp with your fingertips. He moves a hand that was resting on your thighs and pulls his head back so he can watch his fingers rub up and down your wetness as he hears your gentle moans getting louder. He looks up at you, his mouth agape in complete awe of you, and he sees you looking down at him, telling him everything he needs to know without saying a word. He watches your eyes as he steadily slides two fingers in, taking great care in making sure you’re able to feel every single motion he’s creating inside of you. He turns his tongue’s attention back to your flower, focusing on your clit, tenderly guiding it to emphasize your pleasure, his eyes never leaving yours.
“Oh, Roger,” you whisper, finally breaking the silence that had befallen the two of you 20 minutes before. There’s so much you want to tell him – how you love the way he’s making you feel right now, how you love the way he’s lapping up your juices right now, how you love the way his fingers are moving inside you right now – but you’re rendered speechless again, your breath caught deep in your throat. He knows this is what you want. He knows without asking. He can see it on your face that you’re enjoying it. He doesn’t want to stop, not until you’re ready for him to stop. He starts to suck softly on your clit, still slowly moving his fingers in and out of you, knowing how he’s making you feel because your hand has now started to grip his hair. He can feel you tighten and start to tremble with pleasure. He sees you adjusting yourself against his drums – those drums he will never look at the same again – as you get closer to reaching your climax. His eyes never leave yours. Your moaning and breathing both increase, and he doesn’t stop. He keeps licking, keeps sucking, keeps pumping his fingers in and out of you until you finish. Even then, he doesn’t want to stop. He meant it when he said that he loves the way you taste, so even when you finished, he licked up your slit one more time before he stood up and held out his hands for you to grab.
He pulls you up from his drum kit and draws you in for a kiss and you can taste yourself on his lips. He holds you close to him as he turns, walking you back toward the wall. He kicks a guitar out of the way – you don’t know if it’s Brian’s or John’s, nor do you even care – because it was in the way. He leans against the wall with one hand, still kissing you, still dancing his tongue around yours in your mouth, still exuding that passion. Your hands rest against his chest as you break the kiss and look up at him. Your eyes never leave each other.
He pulls one of your legs up and wraps it around his waist, leaving your other foot on the ground before crouching down and pushing himself into you as he grabs your ass and hoists you up. You stayed there, pinned between him and the wall, as he thrust himself hard, but slowly, into you. You wrapped your arms around his neck, running your fingers through his hair, staring deeply into each other’s eyes as you groaned in lockstep with each other, breathing heavy, even starting to sweat at the same time. Still, no words needed to be spoken. You felt everything – you felt his cock hitting your g-spot at the perfect angle and beat, the way his fingers dig into you with every lunge he makes. He feels everything too – every single piece of hair that your hands were grabbing, every grip your walls made around his cock, every single moan that came from your mouth.
You were ready. He was ready. And neither one of you could hold out any longer. Wave after wave of pleasure flowed through your entire bodies, and you pulled his head into yours so you could kiss him as you shared your orgasms. You feel every single drop of his cum burst inside of you. You held his kiss, you didn’t want to let it go, and he didn’t want to let you go either. But you had to let go. You couldn’t stay like this forever.
He pulled himself out of you, slowly, and lowered your leg back down. “You are breathtaking, Y/N,” he whispers. “Where have you been all my life?”
You can’t resist making a joke. You want to resist but you can’t. “Well, for almost half of it I wasn’t even born yet,” you say with a cheeky grin, making him laugh and kiss you again. “Stay with me tonight,” you tell him. It wasn’t a question.
He nods and smiles. “I’d like that,” he whispers.
As you lay in bed, your head on his chest and his arm around you, you didn’t allow yourself to overthink all of this. You didn’t want to analyze anything, you didn’t want to worry about anything. You just wanted to lay there and soak everything in and savor this while you could. You love this, but you hate it all the same. Before your brain delves too deep into the catastrophic end to this fling with Roger that you envisioned, you divert your attention to the night you just had and fall asleep in his arms.
[part 5>>]
334 notes · View notes
foxgarage2 · 2 years
Text
Reflection Pdf Mulan
Christina Aguilera: Reflection (Pop Version) (from Mulan) for clarinet solo, intermediate clarinet sheet music. High-Quality and Interactive, transposable in any key, play along. Includes a High-Quality PDF file to download instantly. Licensed to Virtual Sheet Music® by Hal Leonard® publishing company. Mulan’s friends: Mushu is a small fire dragon. He is one of Mulan’s family guardians. Fa Li, Mulan’s mother, and Grandmother Fa want Mulan to find a good husband. Fa Zhou is Mulan’s father. He was injured fighting for the Emperor in the past. Fa Mulan is a young woman who prefers riding her horse to a more traditional female role. Print and download Reflection sheet music from Mulan arranged for Flute. Instrumental Solo in F Major.
Christina Aguilera has experienced many exciting firsts in her long and successful career, but her debut single – ‘Reflection’, recorded for the original 1998 Mulan soundtrack, which led to her first record deal being signed – holds a special place in her heart. Download the Reflection from Walt Disney’s Mulan sheet music PDF that you can try for free. We give you a 2-page partial preview; in order to continue reading the entire Reflection from Walt Disney’s Mulan sheet music you need to sign up. Download the music sheet notes in PDF format, also available for offline reading.
Mulan

Confucius (Kong Fuzi 孔夫子) was born during the Spring and Autumn era of the Dong Zhou dynasty. Those were times of rupture and drastic changes for the ancient kingdom of China. Warlords battled against each other with armies of hundreds of thousands of men, seeking to overthrow local rulers, take over their lands and their position of royal sovereigns. But in the midst of war and chaos, Confucius searched for peace and order. He studied how wise kings of the past ruled peaceful dynasties, and also how China’s first ancestors lived in harmony with the order of Tian (Heaven 天). He wanted to revive the values and traditions of ancient dynasties, so he became an educator and an advisor to local rulers. Future dynasties adopted Confucius’ teachings as their principles and were able to restore the peace. Many books were written containing and preserving these principles:

Ren 仁 – Humanity, kindness, gentleness. “If one sets one’s heart on ren, there will be none he hates” (Analects 4.4).
Li 禮 – Right behavior: rituals. “To delight in li and music improves you” (Analects 16.5).
Li 禮 – Ritual: worship of the ancestors. “He who offends against Tian has none to whom he can pray” (Analects 3.13).
Li 禮 – Ritual: government obligations. “Provide a leading example to your officers” (Analects 13.2).
Li 禮 – Ritual: disciplines at home. “When abroad serve superiors and ministers, when at home serve elders” (Analects 9.16).
Li 禮 – Ritual: ceremonies. “Fulfill your office eagerly, perform your duties with loyalty” (Analects 12.14).
Yong 勇 – Valor. “The wise are not confused, the valorous are not fearful” (Analects 9.29).
Yi 義 – Moral intention. “If a person has valor but lacks yi, he becomes a bandit” (Analects 17.23).
Zhong 忠 – Loyalty. “If the lord directs his minister with li, the minister will serve his lord with zhong” (Analects 3.19).
Xin 信 – Trustworthiness: keeping promises. “Take zhong and xin as the pivot and have no friends who are not like yourself in this” (Analects 1.8).
Xiao 孝 – Filial piety. “One pupil asked about xiao and Confucius said, Never disobey” (Analects 2.5).
Jing 敬 – Respect. “Confucius said, xiao and jing are the roots of ren” (Analects 1.2).
Shu 恕 – Reciprocity: the Golden Rule. “That which you do not desire, do not do to others” (Analects 15.24).
De 德 – Virtue, power, character. “He who exercises government by means of his de may be compared to the North Star, which keeps its place and all the stars turn towards it” (Analects 2.1).
Junzi 君子 – Gentleman: a man of perfect virtue. “The junzi takes yi as his basic substance; he puts it into practice with li, uses zhong to enact it and ren to complete it” (Analects 15.18).
Dao 道 – Way, method, principle, message. “In the morning hear the dao, in the evening die content” (Analects 4.8).
Wu lun 五倫 – The Five Cardinal Relationships: 1. Subject to ruler. 2. Son to father. 3. Younger brother to older brother. 4. Wife to husband. “Women and small men are difficult to nurture. If you get too close to them, they become stubborn, and if you stay too distant, they become resentful” (Analects 17.25). 5. Friend to friend.

These principles about social behavior and moral thinking are known as the central dogma of the religion. The “pendant for balance” is the Yin Yang symbol: opposites in life complement and depend on each other for balance and harmony. Yin – female, passive, negative, night, cold; Yang – male, active, positive, day, warmth. The symbol is also shown on Mulan’s and Shang’s horses. Shang, or Shang Shu, is the name of one of the books in the Five Classics of Confucianism. Li means “ritual”. Zhou is the name of the dynasty during which Confucius lived. If you interchange the syllables of Confucius’s middle name in Chinese, Fuzi, you get the homophone Chi Fu. Yao was a king during the ancient dynasties whom Confucius regarded as wise and virtuous. Shan means “to usurp”. Yu was the legendary emperor who founded the Xia dynasty. Wu was the emperor who founded the Zhou dynasty. Zhong means loyalty.

References
Elsten, David. “Beyond the Five Relationships: Teachers and Worthies in Early Chinese Thought.” Philosophy East & West 62.3 (2012): 375-91. Web. 27 Nov. 2014.
Eno, R. “The Analects of Confucius: An Online Teaching Translation.” Indiana University. N.p., 2012. Web. 27 Nov. 2014.
Moyers, Bill. “Confucianism.” Films On Demand Digital Educational Video. Films Media Group, 6 Jan. 2007. Web. 27 Nov. 2014.
Mulan. Dir. Tony Bancroft and Barry Cook. Perf. Ming-Na Wen and Eddie Murphy. Walt Disney Feature Animation, 1998. Netflix. Web. 27 Nov. 2014.
Richter, Kent, Eva Räpple, John Modschiedler, and R. Peterson. Understanding Religion in a Global Society. 1st ed. Belmont: Wadsworth, 2005. Print.
Taylor, Rodney L. The Illustrated Encyclopedia of Confucianism. New York: The Rosen Publishing Group, 2005. Print.
Theobald, Ulrich. “Chinese Thought and Philosophy: Confucius and Confucianism.” Chinaknowledge. N.p., 27 July 2010. Web. 27 Nov. 2014.
VEA. “Confucius in Ancient China.” Films On Demand Digital Educational Video. Films Media Group, 20 Mar. 2014. Web. 27 Nov. 2014.

Cindy Marquina. 1998 Disney. All Rights Reserved.
Tumblr media
Look at me, I may never pass for a perfect bride, or a perfect daughter. Can it be, I'm not meant to play this part? Now I see, that if I were truly to be myself, I would break my family's heart.
Who is that girl I see, staring straight back at me? Why is my reflection someone I don’t know? Somehow I cannot hide who I am, though I’ve tried. When will my reflection show who I am inside? How I pray that a time will come I can free myself from their expectations. On that day, I’ll discover some way to be myself, and to make my family proud. They want a docile lamb, no one knows who I am. Must there be a secret me I’m forced to hide? Must I pretend that I am someone else for all time? When will my reflection show who I am inside? When will my reflection show who I am inside?
0 notes
totaltozier · 7 years
Text
Last Chance - Finn x Reader
Request: Hey heyyyy can you do a Finn x reader imagine where you see each other every day at school and like each other but don’t admit it until prom when you’re the only people without a date (slightly like the snow ball scene !!) I don’t send requests often so sorry if it sounds weird xx
Note: So I loved the idea of this request and I was super excited to write it! It’s kind of more like the day before prom and you and Finn don’t have dates yet and everyone else does and ya! I hope you like it!
PLOT: It’s the day before prom and coincidentally you and Finn both don’t have dates!
WORD COUNT: 1478
“Don’t forget to buy your prom tickets before it’s too late! Only one more day until the big night! Tickets are on sale, this period in the cafeteria! Last chance!”
You were walking to lunch with Sadie and Millie as you heard the announcement over the loud speaker in the hall.
“I can’t wait for prom!” Millie squealed. “Noah asked me to go with him last night!”
“Oh my god! That’s so exciting! You two are so cute!” Sadie said. “Y/N, are you gonna ask Finn to go with you to the prom?” she asked you. You had a major crush on Finn but only Sadie and Millie knew. He was part of your friend group and the two of you were great friends but you were too scared to ever tell him.
“Nah, he would probably want to go with someone else anyways” you said, brushing off her question.
“You never know!” Millie said. “He might actually like you back but since you’re both too scared to admit it, you’ll never find out.”
The three of you walked into the cafeteria and scanned around to find the rest of your friends. After a second, you saw Gaten jump up from his spot and wave his arms in the air to get your attention. You waved back and made your way over to the table. Caleb and Finn were seated with him and you slid into the seat next to Finn.
“I don’t know man, I still don’t understand the concept of transposing from the key of D Sharp to B Flat major” Caleb complained to Finn.
“C’mon, it’s simple stuff, you just change this note here, flatten these two notes, change this and bam! B Flat major!” Finn moved the sheet music back towards Caleb who just sighed and put his head in his hands, defeated by the music theory.
“I should have just taken drama; this music theory is too hard” Caleb said.
You pulled out your lunch from your bag and opened a carton of raspberries. You pushed the container towards Finn beside you. “Want some?” you asked.
Finn reached down and picked up a few before popping them in his mouth. “Thanks Y/N!” Finn smiled. You were always happy to share your raspberries with him since they were both your favourite fruit.
“So, um, Sadie” Caleb started, he had put his homework away by now. “Are you uh, going to prom?”
Sadie tucked her long red hair behind her ear. “Yeah I am. Are you?”
Caleb blushed and nervously scratched his head. “Uh, yes I am. I was um, wondering if you would want to go with me, like as my date, sort of thing, possibly?” he asked.
Sadie looked over to Millie who was giving her thumbs up, then back at Caleb. “Um, yeah that would be great, I’d love to!” She answered.
“Awesome great! What colour is your dress?” Caleb asked. The two continued their conversation, discussing colours and flowers and where they were going to take pictures together.
Finn stood up from his seat and grabbed his backpack. “I’ve got to go to my locker,” he looked down at you. “You wanna come with?”
You nodded your head and quickly packed up your stuff. The two of you made your way out of the cafeteria and towards his locker on the other side of the school.
“So, Finn. Are you going to prom?” You asked.
“Yeah, I am, are you?” he looked over as you both kept walking.
“It’s cool how Sadie and Caleb are going together, and Noah and Millie too. I feel like I’m the only one without a date,” you said, taking a deep breath before asking your next question. “Are you going with anyone, like as a date?” You were looking at the floor now, afraid of what his answer might be. You had liked Finn for so long that it would crush your soul if you found out he was going with another girl.
“No not yet,” he answered. “I mean, there is someone I want to go with but I haven’t gotten the courage to ask her yet. She probably wouldn’t say yes anyways.”
You looked up at Finn in shock. “Come on, what girl wouldn’t want to go to prom with Finn Wolfhard? There’s girls dying to just talk to you in the hallway,” you said.
“No way” Finn claimed. “I’m just Finn. Plain old Finn.”
“Well, I think you should ask her” You suggested. “Maybe buy her some nice flowers, possibly a teddy bear too and show up at her door and ask her like a gentleman would. There’s no way she could say no to that!”
“You really think so?” Finn asked.
“Well I know that I would definitely say yes if a boy did that for me” you said, hoping that he couldn’t tell that you wished it was your door that he’d be showing up to with flowers and a proposal for prom.
The period was almost over and Finn was quickly switching out his textbooks from his locker for the next two periods.
“Anyways, are you taking the bus home today? I’ll save you a seat” you said. Finn and you shared the same bus route since you lived only a block away from each other.
“Nah, not today, I’ve got somewhere to go after school” Finn answered. The bell rang loudly through the hall, signalling for the next class to start in three minutes. “I’ve gotta go! See you later Y/N!”
“See ya, Finn!” You watched Finn run down the hall towards his geography class for a moment before turning around and heading towards calculus. You sat down at your desk and pulled out your notebook and pens. The teacher started his lesson but you couldn’t help but wonder who Finn was going to ask to prom. Whoever it was though, you were already wishing it was you instead of them.
You got off the bus and were walking home as you passed Finn’s house on your way to your own. Once inside, you made your way into the kitchen to make a start on the pile of homework you were assigned throughout the day. You were halfway through a terribly boring history reading when the doorbell rang. No one else was home so you got up to answer it. The door swung open to show Finn standing on your porch, both his hands behind his back hiding something.
“Finn?” you were lost as to why he was at your door.
“Hey Y/N!” he had a smile on his face that spread from ear to ear. “May I come in?” He asked.
“Um, sure” you opened the door all the way to let Finn pass before closing it behind him. “What are you doing here?”
“Well I came to bring you these!” He brought his right hand out from behind his back. He had a bright bouquet of flowers in his hand full of yellows, oranges, and pinks.
You were at a loss for words. “What? Why?”
“Oh, and this little guy here.” He brought his left hand around, holding a light brown rabbit with floppy ears and a yellow bow around his neck, and handed him to you.
“Finn, what’s happening?” You asked, accepting the bunny but still completely confused.
A blush crept up on his cheeks and Finn ran a hand through his hair. “Um, well, basically there’s this girl who I want to go to prom with me so badly and it would mean the world to me if she said yes so I actually asked her today how I should prom-pose to her and this is what she told me to do so,” you suddenly realised what was happening and it was all happening so fast, “I couldn’t find a bear so I thought a bunny would work too. Y/N, will you go to prom with me?” Finn asked as he held out the bouquet of flowers to you.
You couldn’t help but smile like an idiot and you could feel the blush rushing to your face. You nodded eagerly and said yes. Finn quickly pulled you in for a hug and you wrapped your arms around him as he hugged you tight.
“I’m so glad you said yes” he said, “I’ve been thinking about going to prom with you for years.”
You pulled out of the hug as you heard him say that. “Years?”
“Yeah, I’ve sort of liked you for a really long time, Y/N.” Finn admitted.
You smiled. “That’s sort of funny because I’ve liked you for a really long time too, Finn.”
“It’s a good thing we’re going to prom together then, eh?” Finn joked.
“There’s no one else I’d rather go with” you gushed before quickly reaching on your tiptoes to kiss Finn on the cheek.
366 notes · View notes
sorcierarchy · 6 years
Note
hey!! I'm getting into tarot - I have a deck and a book that came with it on meanings of the cards, but that's it. I feel super lost and I have no idea where to start - do you have any pointers? I heard recently that tarot is a language unto itself, and it feels kinda overwhelming. Thanks!
Yes! I fucking love tarot, and it’s honestly not anywhere near as complicated as it’s made out to be. You can learn to read tarot effectively and confidently within a really short amount of time if you approach it the right way. 
First of all (and this might be late as you’ve already purchased a deck, but for anyone reading this who is looking for their first deck), get a deck that is fairly detailed. There are some incredibly beautiful minimalist decks, but personally I find a detailed and figurative (not abstract) deck is best to start with. 
Tip 1: Don’t memorize the cards, and don’t read the book. Obviously you can read the book at some point, but my recommendation is not to start there. Infer your own meanings from the cards first, and then complete them with the book as needed. I’ll get into more detail on this further down.
Tip 2: Keep a tarot journal. Or use your grimoire to keep notes if you don’t mind having them all in the same spot, but definitely you want some kind of reference for yourself. This is about keeping your initial impressions on the cards, but also how your impressions evolve over the course of your time with the deck. Even beyond reading the book and making your original inferences, I can guarantee your ideas on what means what will change over time (and that’s not a bad thing). As you said, tarot is like learning a language and as your understanding of the language evolves, your original notes of “x means y” will change and become more nuanced. 
Tip 3: Learn the characteristics first, and piece them together to form the meanings of individual cards. Think of the language example again. In the English language, about 300 words make up roughly 65% of all written works. There are over 170 000 words currently used (not counting multiple meanings, different dialects, etc), so that’s less than 1% of words needed to understand most of written English text. This principle can be transposed, to an extent, to most languages... and that includes tarot. My point is: it’s not about learning all the words right off the bat, but about learning the right ones. This is continued in tips 4 through 8.
Tip 4: Learn the Major Arcana last. These cards represent moments of clarity or change. They are normally big events, decisions, etc, and mark a very clear break between “before” and “after”. I know this is the exact opposite of what most people recommend when it comes to tarot, but personally I find the Minor Arcana cards to be a lot more useful for day to day stuff, and the best way to learn tarot is to use it often for small questions that don’t necessarily require Major Arcana cards. If you start with only the Major Arcana cards and ask “should I do my dishes today” and get the Death card it’s kinda like.... super fucking dramatic, you know? I enjoy being as extra as the next guy, but sometimes it’s like powering off your PC with a sledgehammer.
Tip 5: The suits. Learn the suits as categories by assigning meaning to them on a personal level. If you’re working with a more traditional deck with swords, cups, pentacles, and wands: what do each of these elements mean to you? Go beyond the obvious and get into the details, the backstory, whatever comes to mind. Write down the key words and brainstorm on each of them. For example, when I see swords I think of something more cerebral and intellectual, a decision or action that requires aim and consideration or planning. I also think of the people who tend to hold swords and how they use them (can be used defensively or as an attack). I think about knights, and the sense of duty that a sword can represent. Et cetera for each type.
Tip 6: The numbers. You want to go over the same as the above, but with each number. What does the number two represent for you? What ideas does it conjure? What are the up sides and down sides of it? Another example, the number four reminds me of stability, like a chair with four legs, but it also reminds me of being restrained or walled in. Stability can be good, but can also lead to feeling trapped.
Tip 7: Court cards. Court cards are normally representative of a person, personality type or archetype, or a cluster of characteristics (physical, emotional, or psychological). Personally, I don’t take these cards literally as far as gender, hair colour, etc go (but that’s up to you). For example, a knight with a “male” looking figure doesn’t necessarily mean an adult man. Take hints from the other elements in the card to discern what type of person it could be depicting.
Tip 8: Describe the card and put it all together. And I really do mean “describe the card” on a literal level. What is happening in the card? What are the characters doing, if there are any? What’s the story? Think about how the visual action goes with the characteristics you’ve gained from the suit and number in the card. If there are any other symbols or colours that have personal meaning to you, take those into account as well (for example, if the colour blue is dominant and you associate it with calm). 
Tip 9: Context clues. Again with the language example, but use your context clues. What position is the card in? What was the question asked? How does this card relate to the other cards around it? Pulling together from our previous examples: I have the four of swords in the “what I should do” position of a spread, and the question is regarding finances. The card depicts a person with their head resting on the hilt of two swords, asleep and looking peaceful. The colour palette is predominantly blue. I might interpret this card as saying to rest on my intellect in regards to this matter.
Tip 10: The infamous reversed card. In movies inverted cards always seem to be accompanied with a loud gasp and wide eyes, but this is more commonly  called the “advice” position. Meaning, this card is not so much a prediction as it is a suggestion on what you should be doing or improving, or a warning. Use your context clues to determine which of these is most applicable. 
Practice! Practice practice practice!! This is probably the most obvious thing in the world, but I can’t say it enough. You can spend all day staring at lists of vocabulary and memorizing, but nothing will help you learn faster than going out into the world and using it. Use tarot for ridiculous little things, and big things, and just whenever! Take notes on how you interpreted the card at the time, and then go back to your notes and see where you were right and where you had misread. You’ll be absolutely amazed how often the answer was in the cards and you just didn’t see it. 
I’m sorry this is so long and I hate to make this longer, but as a final note if you would like to learn more about tarot I HIGHLY recommend checking out biddytarot.com
Honestly, there is nothing I could possibly say that Bridget hasn’t already said (and likely a lot more eloquently). She has an amazing set of articles, a fantastic podcast (and a lovely voice), and on top of all that there is a forum where you can do readings for people and get feedback from them. She even has tips and suggestions on how to read tarot professionally. I’m not affiliated with her in any way, but if I had heard her podcast when I first started reading tarot I would have learned it about five years faster than I did. 
I hope this helps! Have fun learning  tarot :) 
244 notes · View notes
viditure · 4 years
Text
Data Science: How do you use sequential data to refine navigational analysis?
When it comes to analysing behavioural web analytics data, analysts often work with aggregates: visit duration, number of page views, bounce rate… These variables make it possible to describe each navigation sequence in a very structured way. They are easy to represent for analysis purposes, or to use in a machine learning environment for example. 
In this article, we suggest ways of processing navigation data at a finer, less aggregated scale, in order to extract new information. The aim is to exploit the sequences of events constituting the visits in a quantitative and fully automated way. More specifically, we look at page sequences over the course of visits, bearing in mind that the approach is transposable to sequences of events, products, etc. 
The first step is to define the objectives of sequence data processing and then to outline a procedure for the automatic extraction of sequences of interest. 
Objectives: identify and exploit sub-sequences of interest 
As Data Scientists at AT Internet, our role is to design solutions that automate the extraction of information and assist our users in their analyses. Since our data model allows us to define objectives (financial or not) on websites, we decided to use it as a variable of interest. In this context, we aim to: 
Identify navigation sub-sequences in relation to the conversion rate and highlight these relationships.  
Annotate the presence of such sub-sequences in the customer journeys and use them as predictive variables in a machine learning model. 
For example: a website records a conversion rate of 3%, but we find that visits to pages A, B and then C generate a conversion rate of 14%. The sequence A>B>C has an associated conversion rate higher than the average conversion rate. It can therefore be considered as a sub-sequence of interest. 
Data sets and modelling 
Identifying data sets 
First of all, we need to give some context to our navigation sequence data: 
Each sequence is annotated as successful or unsuccessful. 
We work with sequences of pages visited, for example: [Home > Product1 > Conditions of sale] (the sequences are deliberately truncated when a conversion page is reached, in order not to introduce a bias). 
In the end, you get a dataset like this one: 
user | sequence | conversion
123  | Homepage > Product1 > Privacy > Contact | false
456  | Product 1 > Product 2 > Product 1 > Return Policy | true
789  | Product 2 | false
Such data sets are often imbalanced: the group that does not convert is usually in the majority, frequently representing more than 95% of visitors. This is a problem commonly encountered in machine learning (conversion, fraud detection, churn, …) and one that could be the subject of an entire article by itself! 
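To make the structure above concrete, here is a minimal sketch (not AT Internet's actual pipeline) of how such a dataset could be represented with pandas; the variable and column names are our own and simply mirror the example table.

```python
import pandas as pd

# Hypothetical toy version of the dataset described above: one row per user,
# the ordered list of pages visited, and a boolean conversion flag.
visits = pd.DataFrame(
    {
        "user": [123, 456, 789],
        "sequence": [
            ["Homepage", "Product1", "Privacy", "Contact"],
            ["Product 1", "Product 2", "Product 1", "Return Policy"],
            ["Product 2"],
        ],
        "conversion": [False, True, False],
    }
)

# The imbalance mentioned above can be checked directly:
print(visits["conversion"].mean())  # observed conversion rate on this toy sample
```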
Modelling and evaluation of sequences of interest 
What are the objectives? 
A sequence of interest is defined by two parameters :   
Its presence must affect the conversion rate (positively or negatively). 
It must be sufficiently present for its impact to be significant across the dataset. 
Quantitative evaluation 
In science, it is best practice to reason quantitatively, so we’ll start by focusing on a metric that will be used to evaluate potential candidate navigation sequences. In this way, we will be able to classify them and retain only the most useful ones. 
In information theory, there is a metric that quantifies uncertainty: entropy. If we consider Y∈{0,1} a random variable indicating whether there has been a conversion, the entropy H(Y) makes it possible to quantify how random this event is on a scale of 0 to 1. In the case of a binary variable, the entropy is simply defined as follows: 
H(Y) = −( P(Y=0) log(P(Y=0)) + P(Y=1) log(P(Y=1)) )     (1)
where P(Y=1) means: “the probability that Y is equal to 1”, i.e. the probability that a conversion will take place. We can estimate this probability by the observed conversion rate. 
A simple graph helps to interpret this: 
Tumblr media
The further the probability of the outcome deviates from equilibrium (0.5), the less uncertainty there is about the outcome (so far, logical!). 
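As an illustration, equation (1) is only a few lines of Python. This is a sketch of our own: the function name is arbitrary, and we use base-2 logarithms so that the entropy of a binary variable stays on the 0-to-1 scale mentioned above.

```python
import numpy as np

def binary_entropy(p):
    """Entropy (in bits) of a binary event with probability p, as in equation (1)."""
    if p in (0.0, 1.0):
        return 0.0  # a certain outcome carries no uncertainty
    return -(p * np.log2(p) + (1 - p) * np.log2(1 - p))

# P(Y=1) is estimated by the observed conversion rate:
print(binary_entropy(0.5))   # 1.0 -> maximum uncertainty
print(binary_entropy(0.03))  # ~0.19 -> a 3% conversion rate leaves little uncertainty
```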
What interests us is the impact of the presence of a given sequence of pages (e.g. A>B>C) on the outcome of Y. Writing X for the presence of A>B>C in the visit (1 if present, 0 if not), we can define the conditional entropy: 
H(Y|X) = P(X=1) H(Y|X=1) + P(X=0) H(Y|X=0)     (2)
where H(Y|X=1) means: “the entropy of Y when X is 1”, i.e. the uncertainty that remains about the act of conversion when we know that the sequence A>B>C is included in the sequence of pages visited. 
Conditional entropy now allows us to quantify the uncertainty on the outcome of Y when the outcome of X is known. 
We have now brought together the two ingredients needed to calculate mutual information: 
I(X,Y) = H(Y) − H(Y|X)     (3)
We therefore read that I(X,Y) is the difference between the uncertainty of the outcome of Y (will the user convert?) and the uncertainty of Y when X is known (will the user convert knowing that they have carried out the sequence A>B>C?). So, this is precisely the information that X carries about Y. 
It is important to note that in addition to meeting our need for information extraction, mutual information also depends on the frequency of appearance of sequences (see 2) and the conversion rate when they are present or not (see 1 and 2), as desired in the previous section. 
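Putting equations (1) to (3) together, a possible sketch of the computation for a single candidate sub-sequence is shown below. The helper names are ours, and contains_subsequence assumes the pages of the pattern appear in the visit in the same order but not necessarily consecutively, in line with sequential-pattern semantics.

```python
import numpy as np

def binary_entropy(p):
    # Same helper as in the previous sketch, repeated so this snippet stands alone.
    if p in (0.0, 1.0):
        return 0.0
    return -(p * np.log2(p) + (1 - p) * np.log2(1 - p))

def contains_subsequence(pages, pattern):
    """True if the pages of `pattern` appear in `pages` in the same order
    (not necessarily consecutively)."""
    it = iter(pages)
    return all(page in it for page in pattern)

def mutual_information(sequences, conversions, pattern):
    """I(X, Y) as in equations (1)-(3): X is the presence of `pattern` in the
    visit, Y the conversion flag."""
    x = np.array([contains_subsequence(seq, pattern) for seq in sequences])
    y = np.array(conversions, dtype=bool)

    h_y = binary_entropy(y.mean())                                 # H(Y)
    p_x1 = x.mean()                                                # P(X=1)
    h_y_x1 = binary_entropy(y[x].mean()) if x.any() else 0.0       # H(Y|X=1)
    h_y_x0 = binary_entropy(y[~x].mean()) if (~x).any() else 0.0   # H(Y|X=0)
    h_y_given_x = p_x1 * h_y_x1 + (1 - p_x1) * h_y_x0              # H(Y|X), eq. (2)
    return h_y - h_y_given_x                                       # I(X,Y), eq. (3)

# Example: how informative is "Product 1" followed later by "Return Policy"?
# mutual_information(visits["sequence"], visits["conversion"], ["Product 1", "Return Policy"])
```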
Information theory was founded by Claude Shannon in the late 1940s and was initially applied in the field of telecommunications. Today it is present in many algorithms used for artificial intelligence applications. Its foundations are presented here in a very simplified way, but a solid mathematical theory guarantees the concepts presented. This video (in French with subtitles) details the origins of this exciting field. 
Candidate sequences 
Now that we are fully equipped to classify sequences by utility, we need to define a strategy for generating candidate sequences. The search space is indeed rapidly becoming vast: if we take a website with 15 pages and consider only navigation sequences of 20 pages at most, we have to test more than 15^20 ≈ 3×10^23 different sequences, which is about the number of stars in the observable universe – quite a few… 
The choice of our generation of candidate sequences is guided by two needs:   
The generation must be based on simple algorithms, leaving room for future iterations and remaining easy to present to less technical stakeholders. 
It must use algorithms available in big data frameworks, in order to guarantee its generalisation to all AT Internet clients, whose data volumes vary considerably. 
For these two reasons we have chosen the PrefixSpan algorithm, which can be explained in a few lines with a schema and whose implementation is available in the Apache Spark MLlib. 
PrefixSpan is a sequential pattern mining algorithm: its job is to identify the most frequent sub-sequences in our sequence dataset. The algorithm has two parameters: 
The minimum support: the minimum frequency of a sub-sequence through the dataset to be considered frequent. 
The maximum length of the sub-sequences searched for, in order to limit exploration. 
PrefixSpan guarantees the extraction of all frequent sub-sequences (in the sense of minimum support) – for this purpose it explores the tree of possible sub-sequences. The diagram below is an example with pages named a, b, c, …, z. A path from the root of the tree to a node therefore represents a sub-sequence of visited pages. 
Tumblr media
As mentioned above, it is unthinkable to explore this tree entirely. This is where PrefixSpan comes into play – by reducing the cost of exploration using two strategies: 
Pruning of the tree by the anti-monotonicity rule: if the sub-sequence A>B>C does not reach the frequency threshold, any sequence of which it is a prefix cannot reach it either. For example, A>B>C>D is necessarily no more frequent than A>B>C, since the second is included in the first. There is therefore no need to explore the sequences for which A>B>C is the prefix, so we can prune all the branches that follow and reduce the calculation. 
The projection of candidate postfixes: at each step, only the postfixes (the parts of the sequences that follow the current prefix) that are still candidates are kept. For example, if we have just processed the sequence A>B>C, we have already eliminated at the first stage all the sub-sequences not beginning with A, then all those not beginning with A>B, then all those not beginning with A>B>C: the set of sequences to be tested quickly shrinks. 
These are the broad outlines of the algorithm, but you can find more detailed information in the original article. 
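As an illustration of the Spark MLlib implementation mentioned above, here is a rough PySpark sketch; the parameter values are purely illustrative and the tiny in-memory dataset stands in for real traffic. Note that the PrefixSpan API works on sequences of itemsets, so each visited page is wrapped in its own one-element list.

```python
from pyspark.sql import SparkSession
from pyspark.ml.fpm import PrefixSpan

spark = SparkSession.builder.getOrCreate()

# Each row holds one visit: a sequence of itemsets, one page per itemset.
df = spark.createDataFrame(
    [
        ([["Homepage"], ["Product1"], ["Privacy"], ["Contact"]],),
        ([["Product 1"], ["Product 2"], ["Product 1"], ["Return Policy"]],),
        ([["Product 2"]],),
    ],
    ["sequence"],
)

# The two parameters discussed above: minimum support and maximum pattern length.
prefix_span = PrefixSpan(minSupport=0.2, maxPatternLength=5, sequenceCol="sequence")

# Returns a DataFrame of frequent sequential patterns with their frequency.
prefix_span.findFrequentSequentialPatterns(df).show(truncate=False)
```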
Summing up 
We have set ourselves an objective: to identify informative sub-sequences on a conversion act. 
We have quantified the notion of information subsequence using mutual information. 
We have a routine for generating candidate sequences, with the guarantee that they are frequent: PrefixSpan. This step requires a minimum frequency of appearance and a possible maximum sequence length. 
The process of extracting informative sequences consists of generating candidate sequences, then accepting or rejecting them according to the criterion of mutual information, or, to a lesser extent, ordering them according to the amount of information they provide (a minimal version of this loop is sketched below). 
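Under the assumption that the candidate patterns come from PrefixSpan (collected back to the driver) and that the mutual_information() helper sketched earlier is available, that loop might look like this:

```python
def extract_informative_sequences(sequences, conversions, candidates, top_k=20):
    """Score each candidate sub-sequence with the mutual information it shares
    with the conversion flag, and keep the most informative ones."""
    scored = [
        (pattern, mutual_information(sequences, conversions, pattern))
        for pattern in candidates
    ]
    scored.sort(key=lambda item: item[1], reverse=True)
    return scored[:top_k]

# `candidates` could be, for example, the flattened patterns returned by PrefixSpan:
# candidates = [[page for itemset in row["sequence"] for page in itemset]
#               for row in frequent_patterns.collect()]
```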
Analytics benefits 
From a descriptive point of view 
This work makes it possible to develop an advanced sequence exploration product to complement the navigation module. An analyst could filter the sequences according to a minimal presence and analyse the effect of such sequences on the conversion rate. Alternatively, they could keep only the sequences associated with the highest / lowest conversion rates and take actions to promote / avoid such sequences. Finally, by setting conversion and frequency thresholds, any AT Internet user could quickly identify which are the key sequences of their visitors. 
From a predictive point of view 
As part of a project to identify hesitant shoppers, we began by developing a model for learning purchasing behaviour based on the simple metrics we described at the beginning of this article: page views, visit time, number of visits, etc. Then, as the model was iterated, we injected variables into it to indicate the presence of sequences that had been selected as informative using the procedure described above. The addition of these variables allowed us to improve our prediction metrics by 5 to 10% depending on the case, and they were therefore retained in our modelling. 
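As a sketch of that annotation step (again with our own helper names, not AT Internet's production code), the retained sub-sequences can be turned into binary columns that sit alongside the aggregate metrics in the training set:

```python
def annotate_with_patterns(visits, patterns):
    """Add one 0/1 column per retained sub-sequence, usable as a predictive
    variable next to page views, visit time, number of visits, etc.
    Reuses the contains_subsequence() helper sketched earlier."""
    annotated = visits.copy()
    for pattern in patterns:
        column = "has_" + " > ".join(pattern)
        annotated[column] = visits["sequence"].apply(
            lambda pages: int(contains_subsequence(pages, pattern))
        )
    return annotated
```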
To go even further 
An informed reader will notice some shortcuts to this presentation. In order to avoid overburdening the technical aspects, a few problems have been left out, such as: 
Class imbalance means that sequences specific to converting users carry little weight in the calculation of PrefixSpan support. For example, if the conversion rate is only 1% and a discriminating sequence is only present in the buyer group, its frequency will weigh for at most 1% in the calculation of the minimum support (which was more like 20% in our uses, as a compromise between the quality and the quantity of the sequences), and the sequence will be filtered out. It is advisable to carry out the extraction separately for the two groups (navigation sequences with and without conversion).  
As sequences are extracted using a univariate criterion (only one sequence is considered at a time), it is common for similar sequences or sequences bearing the same information to be extracted. Post-processing by looking for co-occurrences may be necessary to reduce information redundancy. 
Finally, a paper co-authored by Amazon and Siemens suggests a sequence extraction procedure that takes on board the notion of selection during sequence generation (based on a different algorithm: BIDE), which makes it possible to prune the tree of possible sequences all the more efficiently. This approach seems promising, but it was not initially explored because of its implementation cost. 
Article Data Science: How do you use sequential data to refine navigational analysis? first appeared on Digital Analytics Blog.
from Digital Analytics Blog https://ift.tt/32QoPot via IFTTT
0 notes
Epilogue (La La Land)
Tumblr media Tumblr media
Disclaimer: I do not own anything except for the story itself. Please do not copy or credit this as your own. Photos above are not mine.
Pairing: Altaïr Ibn-La’Ahad x reader
Words: 1270
Warning: none
Tagging: @writingsofawaywardnerd @thepandadrawer @bunnyyumyum @amarabliss @rooks-and-blighters @kebeo @romancingthecreed @freedomaboveallelse @imakemyownblog @thatonepieceofpaper @fortunefavoredthebrave @greyhood99 @scarlet-marionette @thehalodiaries @an-order-of-fryes @outofbluecomesgreen
A/N: Fun fact, I wrote this before Mia and Sebastian’s Theme and then I realized ‘Huh I should probably write that piece before I publish this one.’. So here it is, in its finished form.
You first meet him in a restaurant. A small quaint New York styled cafe in the middle of your favorite neighborhood, perfect for a quick bite.
You swear it wasn’t deliberate when you turn around too fast and spill a good portion of your tea on another customer’s shirt, and this random stranger gets up, grabs as many napkins as he possibly can, and hands them to you. You don’t even dare to explain away your clumsiness, all because of the slow piano music playing in the background that oddly reminds you of your favorite jazz piece, but in a different key.
No matter how many times you apologize to the customer and however many times they say it was alright, you didn’t feel alright. The customer takes their drink from the barista and leaves with a smile, but you don’t share the same smile.
The stranger presses a hand to your shoulder and you look towards him before he looks at you softly.
“They’ll be okay. You didn’t do anything worthy of a lawsuit.” He tells you in a Syrian or Jordanian accent in an attempt to lift your spirits and it works. You laugh to yourself as you wipe away the remnants of tea from the side of your cup.
“Yeah, I suppose that’s true.” You reply as you smile at him.
‘Middle Eastern, tall but good-looking. Dark hair and golden eyes that look like they’ve seen too much.’ You think to yourself as you take a better look at him. You look down at his arm and see a faint scar that looked like it was an old bullet wound. ‘Ex-military?’
“Did you want to go somewhere else? You look like you need it.” He asks you and you manage to snap out of your little daze again.
“Yeah, that sounds great.” You answer as you extend your hand. “I’m Y/N.”
“Altaïr.” The man replies as he shakes your hand firmly. “Come on, I know a place.”
You smile gratefully as the cafe speakers start to play a flowing symphonic orchestra piece just before the door swings shut behind the two of you with a little bell ringing gently in the wind.
You place your hand in Altaïr’s as a trumpet and trombone join a piano in a mini-trio before the rest of the orchestra joins in while the overhead chandelier dims its lights.
“Thank you.” You whisper in Altaïr’s ear and he kisses you on the cheek.
“You’re welcome, Y/N.” He whispers back as he plays with the glittering wedding ring on your finger. You grin at him as you place your head on his shoulder as the orchestra slowly quiets down until you could only hear the wind section playing.
The only thing you could think about as you listened to the music was your wedding and especially the wedding ceremony. Your best friend acting as your maid of honor as she made sure that you looked beautiful on your big day as you fiddled with your hands nervously. You remember Altaïr smiling at the end of the aisle, patiently waiting for you to be escorted to the saint that would bless the two of you before he proclaimed you as a lawfully wedded couple. However, the most important part of the entire ordeal that you remembered was how happy you felt the entire time. It was euphoric and joyful all bundled into one day.
You smile in glee at the memories as the orchestra stops playing and you and Altaïr give a standing ovation with many of the other audience members.
“I love you.” Altaïr murmurs in your ear as the conductor bows for the second time. You reach up and plant a loving kiss on his lips as the applause continues ringing around the concert hall.
You hum a soft lullaby under your breath as you try to soothe Darim back to sleep as soft jazz music plays faintly from the hall. Gentle footsteps slowly make their way towards your son’s nursery as you slowly turn around to see Altaïr leaning against the door frame.
“How is he?” He asks as you carefully hand the baby over.
“Falling asleep as fast as any six-month-old would.” You reply. “Darim takes after his father.”
“Mmhmm, and he snores like his mother.” Altaïr replies and you give him a deadpan stare. Darim coos and grabs a tiny fist full of his father’s shirt before kicking at Altaïr.
“Sounds like he disagrees with you, hmm?” You smirk as you take Darim back from your husband. Outside, you hear a sharp trumpet solo playing on the old gramophone your grandparents gave you and Darim stretches a tiny arm as he makes a little sound in the back of his throat. “You want to listen to jazz, sweetheart? Come on!”
You bounce your son as he claps his hands against your chest and sit down with him on your lap while Altaïr straightens out Darim’s bib when he sits down next to you. Darim looks at the golden speaker of the gramophone and moves to touch it but your arms hold him back so as to prevent it from falling over.
Eventually, Darim gives up and nestles into your arms as Altaïr brushes a few thin strands of hair away from Darim’s face before he wraps an arm around you and you all fall asleep to the sound of singers vocalizing angelically while your favorite orchestra plays alongside them.

Out of the corner of your eye, Darim crawls over to the Christmas tree and tugs on the gold ribbon of a present for your mom and you see Altaïr quickly pick him up before he can unravel the carefully made bow that you spent a good fifteen minutes trying to perfect.
You finish hanging the last icicle on the windowsill only to see your husband blowing raspberries on Darim’s stomach, his laughter surrounding the entire living room.
“Malik and Kadar are picking up my parents tomorrow evening from the airport before they all arrive. When are your parents coming again?” Altaïr asks as he bounces Darim in his arms before setting him down in his playpen.
“A day or two after your parents come. Their flight might get delayed, depending on the weather.” You reply as you place a cube that fell out of the playpen back in.
“That works out nicely. Do you want to see Uncle Malik again, ibni?” You laugh as Darim babbles something incoherent before he lies down on his stomach against one of the toys your dad had given for your son’s first birthday a month ago.
You smile at Darim before turning on one of your Spotify playlists and letting the speakers crackle to life. The first song that plays through the speakers just so happened to be a transposed version of one of your favorite songs from your favorite musical.
“Isn’t that City of Stars?” Altaïr asks as Darim looks curiously at the speaker.
“Yeah.” You laugh as you take a sip of water from your Christmas mug on the coffee table behind you. “The play and the music were amazing.”
“As you have said many times.” Your husband chuckles as he takes your hand and wraps an arm around your waist. He hums the tune as he guides you in a little waltz around the room. Darim giggles while Altaïr spins you around and dips you. Altaïr kisses you as he brings you back up and you drape your arms around his neck while snow begins to fall gently outside with a faint sound of tinkling bells.
30 notes · View notes
Text
Postcards to You - London/Paris
Pairing: Lin-Manuel x Reader
Summary: Your post-graduation gift to yourself is a three-month backpacking trip through Europe. Lin is staying behind because In The Heights is finally starting to take off, but you make sure to keep in contact.
Word Count: 1,070ish
Warnings: Cursing, mentions of alcohol.
A/N: A multi-part slow burn, friends to lovers fic that includes college Lin and In The Heights Lin which might be my favorite combination ever. There’s gonna be six parts to this (I think, do not quote me on this if it turns out to be more)
Biggest thanks ever to my partner in crime @l-nmanuel for reading and supplying the beautiful postcard.
@gratitudejoyandsorrow - you know what it is
||Next Chapter _________________________________
Greetings From Paris!
I know we talked on the phone a few days ago but I miss you already, dork! Being attached at the hip for the past four years makes the world feel much lonelier when you’re not around. How am I gonna survive without you when you’re too busy being a huge Broadway star?
I found a soul brave enough to accompany me down to Dorset to see that KGIII monument that I had told you about. I know you said it was silly but the statue was CRAZY. Almost as lavish and over the top as the man himself. I swear to you, one day I’ll get you to appreciate the American Revolution as much as I do.
You already know how much I love London and since we talked on the phone for a solid two hours about everything that I was wanting to see I’ll spare you the recap (also, if we don’t figure out how to tone it down my entire budget is going to be blown on phone calls with you).
I went back to the Buckingham Palace to take a picture with the guards like you dared me to - it should be coming in a separate envelope with a few others I took for you. I’m gonna blame you for the fact that I didn’t get to see everything that I wanted since I had to double back. But really, three days isn’t nearly a long enough time. What was I thinking? Promise me we’ll find our way back here one day?
I’ve spent one day in Paris so far and it’s gorgeous, you’d love it.
My very first stop was the Pont des Art to put the lock you sent with me on the bridge. I got some funny looks for doing it alone but I didn’t pay much mind to it. I was too busy thinking about how you’d say something totally cliche when we threw the key into the river and how I’d have to threaten to shove you in if you kept it up. I heard about some Wall of Love or something that people also go and do so you haven’t missed your chance to say something ridiculous to me yet.
Tomorrow I’m going to the Louvre, so after I’m done with this letter I’m going to search for that list you made of artists that I need to appreciate. You know, with as many tasks as you’ve given me it’s almost like you and all your bossiness is here with me. I wish you were.
Alright, enough being cheesy. I’m off to dinner!
Au revoir! x
[Y/N]
p.s You better keep me updated on the whole Heights workshop thing. Last I heard you were working on what? Draft number three? Is Tommy a hardass or something? I’ll call you on Wednesday like we planned
Lin ripped open the envelope that had accompanied the postcard, flipping through each breathtaking shot of the British and Parisian architecture. He couldn’t help but grin as he got to the last one. True to your word, you had returned to Buckingham Palace and pulled the most ridiculous face you could, accompanied by an equally ridiculous pose, in a picture you had taken with the stoic guard.
“[Y/N]! I was wondering if you had forgotten about me.” Lin’s enthusiastic voice floated through the phone to you and you scolded yourself for letting a week pass since you had allowed yourself to hear it.
“I could never forget about you, Lin.” you chirped back. “Although, last night I definitely drank enough wine to.”
This drew a chuckle from his side of the line and you were instantly back to pining over him.
“Like impromptu karaoke level drunk?” Lin recalled from your college days and you groaned at the memory - or lack thereof. You were informed about your passionate cover of Ignition the next morning when Lin woke you up with advil and a bottle of water.
“No, I couldn’t possibly sing R Kelly without the smoothest operator I know by my side.” you joked back and it drew a full laugh from him. Years together gave you two an endless amount of inside jokes to refer to at any given point. “How’s drafting coming? Have you written the next Rent yet?”
“God no. You think I can compete with Jonathan Larson?” Lin scoffed and your eyes drifted up to look out the window at the cityscape.
“I know you can.” you insisted and he snorted.
“It’s a work in progress. Maybe I’ll have actually made some by the time you get back.” Lin sighed and you didn’t need to see him to know the weight of the world he had taken upon himself was beginning to make his shoulders slump inwards.
“Hey, you got this. Keep working at it and remember to breathe every once in awhile.” you advised and Lin was silent for a beat.
“I miss you.”
Lin’s muscles protested as he finally stood from his desk after spending the night transposing your advice into lyrics for a new song. You had managed to remain his inspiration even with over three thousand miles separating you from him.
Lin had a little over two hours before his meeting with Tommy, just enough time for a power nap. He shuffled a few papers into a folder so he could easily grab it when he inevitably rushed from his place two and a half hours from now, when a glossy item caught his attention. He uncovered it to see the abandoned stack of photos you had sent him and he couldn’t help but flip through them once more. He paused on one he seemed to have overlooked - it was a picture you had taken of yourself, to the best of your abilities, with the lock he had sent with you. He could see the bronze metal with your names scribbled in his chicken-scratch handwriting just behind you. Your smile seemed to find its way to your eyes and it made his heart long for you.
He dug out a thumbtack from a drawer in his desk and pinned your picture to the wall above his desk. He stepped back to admire its new home before he scribbled a note on the margin of one of the sheets of music he had been laboring over - buy more thumbtacks.
Tagged: @overcaffeinated-and-underslept @itsjaynebird
85 notes · View notes
jazzworldquest-blog · 6 years
Text
USA: Mali + Tigray - Guitars = The Steady-Grooving African-Infused Jazz-Funk of Molly Tigre
Molly Tigre set out from Brooklyn to answer one tough question: What if the 70s vibes of the cult Ethiopiques series collided with Saharan desert rock and West African blues, but with no guitar to lead the melodic way?
Molly Tigre’s answer is audible in the quintet’s studio debut Molly Tigre (Very Special Recordings; digital and cassette release date: May 14, 2018). The sound is dark and slinky and mysteriously funky, brazenly open to the peculiar global sonic influences that wash over musicians on the streets of the outer boroughs. The premise sounds like some quirky and academic composition challenge, but the mashup has led to some seriously good music, tracks that explore and question without losing sight of the groove.
“I wanted to bring together some of the music and styles from Northern Mali and certain regions in Ethiopia, like Tigray,” the genesis of the band’s name, notes bassist and co-founder Ezra Gale. “I hear a lot of commonalities between them, like the pentatonic scales that are similar sounding. The fact that the rhythms they are using are based around groups of six. They subdivide that differently but there’s a thread that ties them together. When I started playing the music side by side, I thought it was fascinating and I wanted to mash them together.”
He tossed the idea around with sax player Mitch Marcus, longtime friend who has toured with the likes of Donovan and who was former bandmate in the West Coast Afrobeat/-pop group Aphrodesia. “We both realized we were big fans of that music, and not many musicians were doing anything with that at the time,” says Marcus. “That was what we wanted to try originally.”
Mixing two different sets of styles, timbres, and rhythms from opposite ends of a large continent wasn’t enough, however. Gale and Marcus wanted to shake up the approach to the instrumentation often found in many Afro-inspired, groove-oriented bands. “When we started thinking about these very different styles from two different regions, something else came up. I love the sound of no chords, when sax and bass are the only melody instruments,” Gale explains. “There’s a tradition of this in jazz, as people have done piano-less quartets. You get to imply harmonies without a guitar or piano spelling it out, which makes it open and free. It’s hard to do well and make it sound full.”
Molly Tigre went for it, nonetheless. Marcus and Gale recruited sax and flute player Chris Hiatt (Japonize Elephants), drummer Joey Abba (The Ramones), and percussionist Ibrahima Kolipe Camara (National Dance Company of Guinea, Kakande), with occasional blurts of Farfisa provided by a battered old organ one of their recording engineers dragged out of the trash. “We’ve had chordal players sit in with us live,” remarks Marcus, “but not having the chords spelled out adds this space to the songs that’s really nice.”
Instead of the guitar-guided sound common to both Mali and some Ethiopian groups, Gale and Marcus often look to percussion sounds and ideas for inspiration. “From the start, percussion was a really vital element in our writing,” muses Gale. “We’re not just writing a horn melody and a bass line and, okay, whatever the percussion wants to do over that is fine. I think of it as another line in the composition that’s integral to the performance and has a lot of the range of a piano or guitar.”
Percussion lines and rhythmic hooks sparked tracks like “Hello Bolly,” Marcus’s rollicking, rolling tribute to Bollywood soundtracks but with an Afro-diasporic twist. Gale was also moved by the groove to craft “Slush Fund,” a song he swore was a copycat of a Kenyan James Brown-esque track he would spin at a regular DJ gig. “When I went and listened again,” he laughs, “it was nothing like it, except it was in 6/8.”
Though the pieces on the album were inspired by a somewhat abstract premise, once they get down and dirty, it’s all about the music. The film-noir funk of “Lebanese Blond” pits two melodies against one another, leaving plenty of room for improvisation as they weave in and out. “Ethiofreaks” adds vibes to the mix, a tip of the hat to Ethiopian jazz master and vibe player Mulatu Astatke, for an original take on the Ethiopiques sound. Some tributes are even more direct: Astatke’s 70s gem “Yekermo Sew” keeps its serpentine, modal feel, but winds up with new harmonies. “We ended up accidently reharmonizing it,” says Marcus. “I handed out a chart to the band in a particular key; the alto and tenor sax are in different keys. Chris was playing the wrong thing, for lack of a better word, as he was supposed to transpose his line. But it sounded really good in fourths, so we ran with it.”
Running with that open space left where guitars might be, with that room to stretch out and improvise, means combining untold numbers of influences, the kind of thing New York musicians absorb just from walking down the block, past the bodega, the stoop or car stereo speakers, the singing neighbor, the subway violinist.
“Even if we wanted to make this a tribute to these styles, it would never come out that way. We live here, with so much swirling around,” says Gale. “We’re playing Africa-influenced music, but filtered through these lenses,” he adds, “and we love it because it’s original.”
“When you add improvisation into the mix,” Marcus adds, “you’re going to get something different out.”
Molly Tigre marries the rhythms and melodies of African music, especially the entrancing styles from Ethiopia and Mali, with a uniquely progressive and exploratory jazz sound. Featuring the compositions of saxophonist Mitch Marcus (Donovan, Dave Dreiwitz) and bassist Ezra Gale (Super Hi-Fi, Aphrodesia), the band also features saxophonist Chris Hiatt (Japonize Elephants), drummer Joe Abba (Dave Douglas, The Ramones, Donny McCaslin) and percussionist Ibrahima Kolipe Camara (National Dance Company of Guinea, Kakande). The New York City-based quintet has been stewing since 2015, but their wide-ranging sound hasn't been captured on tape until now. Their debut album, recorded at Nine Lives Studio in Jersey City and at The Creamery in Brooklyn, features eight ear-popping tracks that range from the imaginary Bollywood-meets-West Africa mash-up "Hello Bolly" to the searing funk of "Lebanese Blond" to the outer space jazz of "Y Knot" to the winking homage to classic 70's Ethiopian groove, "Ethiofreaks," which even comes complete with guest vibraphone (from Tommy Mattioli) and Farfisa organ.
Memorably described as “Ethiopian funk drinks a progressive jazz cocktail at Miles Davis’ 1970’s loft party,” Molly Tigre’s polyrhythmic stew is music for the body and mind.
Links
Website
Facebook
Twitter
Contact
Publicist
Ron Kadish
812-339-1195 X 202
via Blogger https://ift.tt/2GWCBKf
0 notes
yoon-kooks · 7 years
Text
Hired by BigHit: Bittersweet- Part 6
Pairing: Yoongi x Reader
Genre: Fluff, Angst
Summary: You’ve always had respect for Min Yoongi’s musical talent. As one of BigHit’s new producers, you now get to experience both the bitter and sweet sides of him.
Parts: 1 // 2 // 3 // 4 // 5 // 6 // 7 // 8 // Epilogue (text) 
“Should I add in this beat here?” You pointed to a small section in your rap track. It was coming along better than you had expected. You were finally starting to get a feel for how to really bring out the hiphop in producing. And you had Yoongi right there next to you for support.
“I don’t know, should you?” He raised an eyebrow.
“I think I should?” You tilted your head as you stared at him, as if the answer was written on his face.
“I think you should trust in yourself more.”
“But I like getting your opinion… It means a lot to me…”
“Well what if I’m not here to help you?”
“I mean, of course I would try my best,” you said. “But I’d wait for you.” Yoongi’s eyes widened.
“Very cute.” He patted your head. Your face burned as you pouted away.
Knock. You turned and saw Bang PD standing in the doorway. “Ah! Good timing. You’re both here.”
“Here for what?” Yoongi was just as clueless as you were.
“I’ve been thinking about this since we first hired Y/N,” Bang PD said. “Your styles would mesh well together. And I heard you guys get along pretty nicely, too.” He paused dramatically.
Your ears perked up to catch what was coming next.
“The two of you are officially set to collaborate on a track for the next album.”
“What?” you and Yoongi said in unison.
“You heard me.” Bang PD raised his eyebrows and smiled. “I’m expecting something good from you two.” He exited the studio before you had the chance to thank him for the opportunity.
You were still shook. It was already a blessing that you had the chance to work with Yoongi, but now your name would be right next to his in Bangtan Sonyeondan’s song credits. That was truly your dream. But you were getting way ahead of yourself. You still had to produce an album-worthy track that the BigHit crew would approve of. So you had to stay focused.
“Have you ever collaborated with someone else?” Yoongi broke your train of thought.
“This’ll be my first time,” you said. But you were sure it wouldn’t be any different from the way you were currently working with Yoongi anyway.
“Well I’ll tell you right now, we won’t agree on everything.” He looked at you. “And we shouldn’t agree on everything.” That worried you a little. You definitely didn’t want to risk your relationship with him by disagreeing on something music-related. But for the sake of producing a quality track, you knew you shouldn’t be afraid to challenge your partner’s decisions every once in a while.
You nodded and gave him a thumbs up to show you were ready to work alongside him as his co-producer.
——-
“I think we should base the track around that melody you were humming in your studio that one morning.” Yoongi pointed out.
“Huh? Which one?” You were so busy with your rap track that you had forgotten about your other works.
“Did you really forget?” Yoongi moved to his piano. “Even I know it.” He played the melody even more beautifully than you remembered.
“Wait, that’s what you were playing the other day when you were just messing around, isn’t it?” You pressed a finger to your lip.
“It is, but I just went off of what you were humming and transposed it to the piano.” He continued to play as your memory slowly came back to you. It was from your second day at BigHit. When you had jumped at the thought of Yoongi hearing your unconfident voice. Just how long had he been listening at the doorway to be able to capture the melody so vividly? Had your melody really been on his mind all this time?
“That was meant for a ballad track though…” You didn’t want to sound too eager, but it really did mean the world to you that he wanted to work with your melody for a Bangtan song.
“Bang PD said our styles would mesh well, didn’t he?” Yoongi started playing something you definitely had not heard before. Yet, it really did complement your part nicely. “I’m sure we can figure something out.”
“Let’s do it then!” You smiled at him as he moved his fingers across the piano. Just for a moment, you let your eyes focus on his pale pink lips. They were parted slightly and looked very kissable. As he moved his hand up an octave, his eyes caught you staring, but continued on without saying a thing.
When he finally finished, Yoongi left his fingers on the keys and stared down at the piano. You noticed the studio filled with silence.
You sat down next to him and tried to replicate what you had heard him play. “You should teach me that new part. I really liked it.” You elbowed him softly and giggled.
“Y/N.” A chill hit you as he spoke your name.
You stopped giggling and turned to him. “Yeah?”
“Do you like me?”
A/N: I just wanna let y’alls know even I didn’t expect this part to end that way o.o
168 notes · View notes
tak4hir0 · 5 years
Link
Transformers from scratch Transformers are a very exciting family of machine learning architectures. Many good tutorials exist (e.g. [1, 2]) but in the last few years, transformers have mostly become simpler, so that it is now much more straightforward to explain how modern architectures work. This post is an attempt to explain directly how modern transformers work, and why, without some of the historical baggage. I will assume a basic understanding of neural networks and backpropagation. If you'd like to brush up, this lecture will give you the basics of neural networks and this one will explain how these principles are applied in modern deep learning systems. A working knowledge of Pytorch is required to understand the programming examples, but these can also be safely skipped. Self-attention The fundamental operation of any transformer architecture is the self-attention operation. We'll explain where the name "self-attention" comes from later. For now, don't read too much in to it. Self-attention is a sequence-to-sequence operation: a sequence of vectors goes in, and a sequence of vectors comes out. Let's call the input vectors \(\x_1, \x_2, \ldots \x_t\) and the corresponding output vectors \(\y_1, \y_2, \ldots, \y_t\). The vectors all have dimension \(k\). To produce output vector \(\y_\rc{i}\), the self attention operation simply takes a weighted average over all the input vectors $$ \y_\rc{i} = \sum_{\gc{j}} w_{\rc{i}\gc{j}} \x_\gc{j} \p $$ Where the weights sum to one over all \(\gc{j}\). The weight \(w_{\rc{i}\gc{j}}\) is not a parameter, as in a normal neural net, but it is derived from a function over \(\x_\rc{i}\) and \(\x_\gc{j}\). The simplest option for this function is the dot product: $$ w'_{\rc{i}\gc{j}} = {\x_\rc{i}}^T\x_\gc{j} \p $$ This gives us a value anywhere between negative and positive infinity, so we apply a softmax to map the values to \([0, 1]\) and to ensure that they sum to 1 over the whole sequence: $$ w_{\rc{i}\gc{j}} = \frac{\text{exp } w'_{\rc{i}\gc{j}}}{\sum_\gc{j} \text{exp }w'_{\rc{i}\gc{j}}} \p $$ And that's the basic operation of self attention. A visual illustration of basic self-attention. Note that the softmax operation over the weights is not illustrated. A few other ingredients are needed for a complete transformer, which we'll discuss later, but this is the fundamental operation. More importantly, this is the only operation in the whole architecture that propagates information between vectors. Every other operation in the transformer is applied to each vector in the input sequence without interactions between vectors. Understanding why self-attention works Despite its simplicity, it's not immediately obvious why self-attention should work so well. To build up some intuition, let's look first at the standard approach to movie recommendation. Let's say you run a movie rental business and you have some movies, and some users, and you would like to recommend movies to your users that they are likely to enjoy. One way to go about this, is to create manual features for your movies, such as how much romance there is in the movie, and how much action, and then to design corresponding features for your users: how much they enjoy romantic movies and how much they enjoy action-based movies. If you did this, the dot product between the two feature vectors would give you a score for how well the attributes of the movie match what the user enjoys. 
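As a toy illustration of that score (the feature values below are made up for the example, not taken from any real recommender), the dot product is just the sum of the feature-wise products of the movie vector and the user vector:

# hypothetical features: [romance, action]
movie_romcom   = [ 0.9, -0.2]   # very romantic, little action
movie_actioner = [-0.5,  0.8]   # unromantic, lots of action
user           = [ 0.7, -0.6]   # loves romance, dislikes action

def score(movie, user):
    # dot product: sum of feature-wise products
    return sum(m * u for m, u in zip(movie, user))

print(score(movie_romcom, user))    # 0.75  -> good match
print(score(movie_actioner, user))  # -0.83 -> poor match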
If the signs of a feature match for the user and the movie—the movie is romantic and the user loves romance or the movie is unromantic and the user hates romance—then the the resulting dot product gets a positive term for that feature. If the signs don't match—the movie is romantic and the user hates romance or vice versa—the corresponding term is negative. Furthermore, the magnitudes of the features indicate how much the feature should contribute to the total score: a movie may be a little romantic, but not in a noticeable way, or a user may simply prefer no romance, but be largely ambivalent. Of course, gathering such features is not practical. Annotating a database of millions of movies is very costly, and annotating users with their likes and dislikes is pretty much impossible. What happens instead is that we make the movie features and user features parameters of the model. We then ask users for a small number of movies that they like and we optimize the user features and movie features so that their dot product matches the known likes. Even though we don't tell the model what any of the features should mean, in practice, it turns out that after training the features do actually reflect meaningful semantics about the movie content. The first two learned features from a basic matrix factorization model. The model had no access to any information about the content of the movies, only which users liked the. Note that movies are arranged from low-brow to high-brow horizontally, and from mainstream to quirky vertically. From [4]. See this lecture for more details on recommender systems. For now, this suffices as an explanation of how the dot product helps us to represent objects and their relations.This is the basic principle at work in the self-attention. Let's say we are faced with a sequence of words. To apply self-attention, we simply assign each word \(\bc{t}\) in our vocabulary an embedding vector \(\v_\bc{t}\) (the values of which we'll learn). This is what's known as an embedding layer in sequence modeling. It turns the word sequence $$\bc{\text{the}}, \bc{\text{cat}}, \bc{\text{walks}}, \bc{\text{on}}, \bc{\text{the}}, \bc{\text{street}}$$ into the vector sequence $$\v_\bc{\text{the}}, \v_\bc{\text{cat}}, \v_\bc{\text{walks}}, \v_\bc{\text{on}}, \v_\bc{\text{the}}, \v_\bc{\text{street}} \p $$ If we feed this sequence into a self-attention layer, the output is another sequence of vectors $$\y_\bc{\text{the}}, \y_\bc{\text{cat}}, \y_\bc{\text{walks}}, \y_\bc{\text{on}}, \y_\bc{\text{the}}, \y_\bc{\text{street}} $$ where \(\y_\bc{\text{cat}}\) is a weighted sum over all the embedding vectors in the first sequence, weighted by their (normalized) dot-product with \(\v_\bc{\text{cat}}\). Since we are learning what the values in \(\v_\bc{t}\) should be, how "related" two words are is entirely determined by the task. In most cases, the definite article the is not very relevant to the interpretation of the other words in the sentence; therefore, we will likely end up with an embedding \(\v_\bc{\text{the}}\) that has a low or negative dot product with all other words. On the other hand, to interpret what walks means in this sentence, it's very helpful to work out who is doing the walking. This is likely expressed by a noun, so for nouns like cat and verbs like walks, we will likely learn embeddings \(\v_\bc{\text{cat}}\) and \(\v_\bc{\text{walks}}\) that have a high, positive dot product together. This is the basic intuition behind self-attention. 
The dot product expresses how related two vectors in the input sequence are, with "related" defined by the learning task, and the output vectors are weighted sums over the whole input sequence, with the weights determined by these dot products. Before we move on, it's worthwhile to note the following properties, which are unusual for a sequence-to-sequence operation: There are no parameters (yet). What the basic self-attention actually does is entirely determined by whatever mechanism creates the input sequence. Upstream mechanisms, like an embedding layer, drive the self-attention by learning representations with particular dot products (although we'll add a few parameters later). Self attention sees its input as a set, not a sequence. If we permute the input sequence, the output sequence will be exactly the same, except permuted also (i.e. self-attention is permutation equivariant). We will mitigate this somewhat when we build the full transformer, but the self-attention by itself actually ignores the sequential nature of the input. In Pytorch: basic self-attention What I cannot create, I do not understand, as Feynman said. So we'll build a simple transformer as we go along. We'll start by implementing this basic self-attention operation in Pytorch. The first thing we should do is work out how to express the self attention in matrix multiplications. A naive implementation that loops over all vectors to compute the weights and outputs would be much too slow. We'll represent the input, a sequence of \(t\) vectors of dimension \(k\) as a \(t\) by \(k\) matrix \(\X\). Including a minibatch dimension \(b\), gives us an input tensor of size \((b, t, k)\). The set of all raw dot products \(w'_{\rc{i}\gc{j}}\) forms a matrix, which we can compute simply by multiplying \(\X\) by its transpose: import torch import torch.nn.functional as F # assume we have some tensor x with size (b, t, k) x = ... raw_weights = torch.bmm(x, x.transpose(1, 2)) # - torch.bmm is a batched matrix multiplication. It # applies matrix multiplication over batches of # matrices. Then, to turn the raw weights \(w'_{\rc{i}\gc{j}}\) into positive values that sum to one, we apply a row-wise softmax: weights = F.softmax(raw_weights, dim=2) Finally, to compute the output sequence, we just multiply the weight matrix by \(\X\). This results in a batch of output matrices \(\Y\) of size (b, t, e) whose rows are weighted sums over the rows of \(\X\). y = torch.bmm(weights, x) That's all. Two matrix multiplications and one softmax gives us a basic self-attention. Additional tricks The actual self-attention used in modern transformers relies on three additional tricks. 1) Queries, keys and values Every input vector \(\x_\rc{i}\) is used in three different ways in the self attention operation: It is compared to every other vector to establish the weights for its own output \(\y_\rc{i}\) It is compared to every other vector to establish the weights for the output of the \(\gc{j}\)-th vector \(\y_\gc{j}\) It is used as part of the weighted sum to compute each output vector once the weights have been established. These roles are often called the query, the key and the value (we'll explain where these names come from later). In the basic self-attention we've seen so far, each input vector must play all three roles. We make its life a little easier by deriving new vectors for each role, by applying a linear transformation to the original input vector. 
In other words, we add three \(k \times k\) weight matrices \(\W_q\), \(\W_k\), \(\W_v\) and compute three linear transformations of each \(\x_\rc{i}\), for the three different parts of the self attention:

$$ \begin{align*} \q_\rc{i} &= \W_q\x_\rc{i} & \k_\rc{i} &= \W_k\x_\rc{i} & \v_\rc{i} &= \W_v\x_\rc{i} \end{align*} $$

$$ \begin{align*} w'_{\rc{i}\gc{j}} &= {\q_\rc{i}}^T\k_\gc{j} \\ w_{\rc{i}\gc{j}} &= \text{softmax}(w'_{\rc{i}\gc{j}})\\ \y_\rc{i} &= \sum_\gc{j} w_{\rc{i}\gc{j}} \v_\gc{j}\p\\ \end{align*} $$

This gives the self-attention layer some controllable parameters, and allows it to modify the incoming vectors to suit the three roles they must play.

Illustration of the self-attention with key, query and value transformations.

2) Scaling the dot product

The softmax function can be sensitive to very large input values. These kill the gradient, and slow down learning, or cause it to stop altogether. Since the average value of the dot product grows with the embedding dimension \(k\), it helps to scale the dot product back a little to stop the inputs to the softmax function from growing too large:

$$ w'_{\rc{i}\gc{j}} = \frac{{\q_\rc{i}}^T\k_\gc{j}}{\sqrt{k}} $$

Why \(\sqrt{k}\)? Imagine a vector in \({\mathbb R^k}\) with values all \(c\). Its Euclidean length is \(\sqrt{k}c\). Therefore, we are dividing out the amount by which the increase in dimension increases the length of the average vectors.

3) Multi-head attention

Finally, we must account for the fact that a word can mean different things to different neighbours. Consider the following example.

$$\bc{\text{mary}}, \bc{\text{gave}}, \bc{\text{roses}}, \bc{\text{to}}, \bc{\text{susan}}$$

We see that the word gave has different relations to different parts of the sentence. mary expresses who's doing the giving, roses expresses what's being given, and susan expresses who the recipient is. In a single self-attention operation, all this information just gets summed together. If Susan gave Mary the roses instead, the output vector \(\y_\bc{\text{gave}}\) would be the same, even though the meaning has changed.

We can give the self attention greater power of discrimination, by combining several self attention mechanisms (which we'll index with \(\bc{r}\)), each with different matrices \(\W_q^\bc{r}\), \(\W_k^\bc{r}\), \(\W_v^\bc{r}\). These are called attention heads. For input \(\x_\rc{i}\) each attention head produces a different output vector \(\y_\rc{i}^\bc{r}\). We concatenate these, and pass them through a linear transformation to reduce the dimension back to \(k\).

In Pytorch: complete self-attention

Let's now implement a self-attention module with all the bells and whistles. We'll package it into a Pytorch module, so we can reuse it later.

Combining three attention heads into one matrix multiplication (for the queries).

import torch
from torch import nn
import torch.nn.functional as F

class SelfAttention(nn.Module):
    def __init__(self, k, heads=8):
        super().__init__()
        self.k, self.heads = k, heads

We think of the \(h\) attention heads as \(h\) separate sets of three matrices \(\W^\bc{r}_q\), \(\W^\bc{r}_k\), \(\W^\bc{r}_v\), but it's actually more efficient to combine these for all heads into three single \(k \times hk\) matrices, so that we can compute all the concatenated queries, keys and values in a single matrix multiplication.
        # These compute the queries, keys and values for all
        # heads (as a single concatenated vector)
        self.tokeys    = nn.Linear(k, k * heads, bias=False)
        self.toqueries = nn.Linear(k, k * heads, bias=False)
        self.tovalues  = nn.Linear(k, k * heads, bias=False)

        # This unifies the outputs of the different heads into
        # a single k-vector
        self.unifyheads = nn.Linear(heads * k, k)

We can now implement the computation of the self-attention (the module's forward function). First, we compute the queries, keys and values:

    def forward(self, x):
        b, t, k = x.size()
        h = self.heads

        queries = self.toqueries(x).view(b, t, h, k)
        keys    = self.tokeys(x)   .view(b, t, h, k)
        values  = self.tovalues(x) .view(b, t, h, k)

The output of each linear module has size (b, t, h*k), which we simply reshape to (b, t, h, k) to give each head its own dimension.

Next, we need to compute the dot products. This is the same operation for every head, so we fold the heads into the batch dimension. This ensures that we can use torch.bmm() as before, and the whole collection of keys, queries and values will just be seen as a slightly larger batch. Since the head and batch dimension are not next to each other, we need to transpose before we reshape. (This is costly, but it seems to be unavoidable.)

        # - fold heads into the batch dimension
        keys    = keys.transpose(1, 2).contiguous().view(b * h, t, k)
        queries = queries.transpose(1, 2).contiguous().view(b * h, t, k)
        values  = values.transpose(1, 2).contiguous().view(b * h, t, k)

As before, the dot products can be computed in a single matrix multiplication, but now between the queries and the keys.

        # - get dot product of queries and keys, and scale
        dot = torch.bmm(queries, keys.transpose(1, 2)) / (k ** (1 / 2))
        # - dot has size (b*h, t, t) containing raw weights

        dot = F.softmax(dot, dim=2)
        # - dot now contains row-wise normalized weights

We apply the self attention to the values, giving us the output for each attention head

        # apply the self attention to the values
        out = torch.bmm(dot, values).view(b, h, t, k)

To unify the attention heads, we transpose again, so that the head dimension and the embedding dimension are next to each other, and reshape to get concatenated vectors of dimension \(kh\). We then pass these through the unifyheads layer to project them back down to \(k\) dimensions.

        # swap h, t back, unify heads
        out = out.transpose(1, 2).contiguous().view(b, t, h * k)
        return self.unifyheads(out)

And there you have it: multi-head, scaled dot-product self attention. You can see the complete implementation here.

Building transformers

A transformer is not just a self-attention layer, it is an architecture. It's not quite clear what does and doesn't qualify as a transformer, but here we'll use the following definition: Any architecture designed to process a connected set of units—such as the tokens in a sequence or the pixels in an image—where the only interaction between units is through self-attention.

As with other mechanisms, like convolutions, a more or less standard approach has emerged for how to build self-attention layers up into a larger network. The first step is to wrap the self-attention into a block that we can repeat.

The transformer block

There are some variations on how to build a basic transformer block, but most of them are structured roughly like this: That is, the block applies, in sequence: a self attention layer, layer normalization, a feed forward layer (a single MLP applied independently to each vector), and another layer normalization.
Residual connections are added around both, before the normalization. The order of the various components is not set in stone; the important thing is to combine self-attention with a local feedforward, and to add normalization and residual connections. Normalization and residual connections are standard tricks used to help deep neural networks train faster and more accurately. The layer normalization is applied over the embedding dimension only. Here's what the transformer block looks like in pytorch. class TransformerBlock(nn.Module): def __init__(self, k, heads): super().__init__() self.attention = SelfAttention(k, heads=heads) self.norm1 = nn.LayerNorm(k) self.norm2 = nn.LayerNorm(k) self.ff = nn.Sequential( nn.Linear(k, 4 * k), nn.ReLU(), nn.Linear(4 * k, k)) def forward(self, x): attended = self.attention(x) x = self.norm1(attended + x) fedforward = self.ff(x) return self.norm2(fedforward + x) We've made the relatively arbitrary choice of making the hidden layer of the feedforward 4 times as big as the input and output. Smaller values may work as well, and save memory, but it should be bigger than the input/output layers. Classification transformer The simplest transformer we can build is a sequence classifier. We'll use the IMDb sentiment classification dataset: the instances are movie reviews, tokenized into sequences of words, and the classification labels are positive and negative (indicating whether the review was positive or negative about the movie). The heart of the architecture will simply be a large chain of transformer blocks. All we need to do is work out how to feed it the input sequences, and how to transform the final output sequence into a a single classification. The whole experiment can be found here. We won't deal with the data wrangling in this blog post. Follow the links in the code to see how the data is loaded and prepared. Output: producing a classification The most common way to build a sequence classifier out of sequence-to-sequence layers, is to apply global average pooling to the final output sequence, and to map the result to a softmaxed class vector. Overview of a simple sequence classification transformer. The output sequence is averaged to produce a single vector representing the whole sequence. This vector is projected down to a vector with one element per class and softmaxed to produce probabilities. Input: using the positions We've already discussed the principle of an embedding layer. This is what we'll use to represent the words. However, as we've also mentioned already, we're stacking permutation equivariant layers, and the final global average pooling is permutation invariant, so the network as a whole is also permutation invariant. Put more simply: if we shuffle up the words in the sentence, we get the exact same classification, whatever weights we learn. Clearly, we want our state-of-the-art language model to have at least some sensitivity to word order, so this needs to be fixed. The solution is simple: we create a second vector of equal length, that represents the position of the word in the current sentence, and add this to the word embedding. There are two options. position embeddings We simply embed the positions like we did the words. Just like we created embedding vectors \(\v_\bc{\text{cat}}\) and \(\v_\bc{\text{susan}}\), we create embedding vectors \(\v_\bc{\text{12}}\) and \(\v_\bc{\text{25}}\). Up to however long we expect sequences to get. 
The drawback is that we have to see sequences of every length during training, otherwise the relevant position embeddings don't get trained. The benefit is that it works pretty well, and it's easy to implement. position encodings Position encodings work in the same way as embeddings, except that we don't learn the position vectors, we just choose some function \(f: {\mathbb N} \to {\mathbb R}^k\) to map the positions to real valued vectors, and let the network figure out how to interpret these encodings. The benefit is that for a well chosen function, the network should be able to deal with sequences that are longer than those it's seen during training (it's unlikely to perform well on them, but at least we can check). The drawbacks are that the choice of encoding function is a complicated hyperparameter, and it complicates the implementation a little. For the sake of simplicity, we'll use position embeddings in our implementation. Pytorch Here is the complete text classification transformer in pytorch. class Transformer(nn.Module): def __init__(self, k, heads, depth, seq_length, num_tokens, num_classes): super().__init__() self.num_tokens = num_tokens self.token_emb = nn.Embedding(k, num_tokens) self.pos_emb = nn.Embedding(k, seq_length) # The sequence of transformer blocks that does all the # heavy lifting tblocks = [] for i in range(depth): tblocks.append(TransformerBlock(emb=emb, heads=heads)) self.tblocks = nn.Sequential(*tblocks) # Maps the final output sequence to class logits self.toprobs = nn.Linear(emb, num_classes) def forward(self, x): """ :param x: A (b, t) tensor of integer values representing words (in some predetermined vocabulary). :return: A (b, c) tensor of log-probabilities over the classes (where c is the nr. of classes). """ # generate token embeddings tokens = self.token_emb(x) b, t, e = tokens.size() # generate position embeddings positions = torch.arange(t) positions = self.pos_emb(positions)[None, :, :].expand(b, t, e) x = tokens + positions x = self.tblocks(x) # Average-pool over the t dimension and project to class # probabilities x = self.toprobs(x.mean(dim=1)) return F.log_softmax(x, dim=1) At depth 6, with a maximum sequence length of 512, this transformer achieves an accuracy of about 85%, competitive with results from RNN models, and much faster to train. To see the real near-human performance of transformers, we'd need to train a much deeper mode on much more data. More about how to do that later. Text generation transformer The next trick we'll try is an autoregressive model. We'll train a character level transformer to predict the next character in a sequence. The training regime is simple (and has been around for far longer than transformers have). We give the sequence-to-sequence model a sequence, and we ask it to predict the next character at each point in the sequence. In other words, the target output is the same sequence shifted one character to the left: With RNNs this is all we need to do, since they cannot look forward into the input sequence: output \(i\) depends only on inputs \(0\) to \(i\). With a transformer, the output depends on the entire input sequence, so prediction of the next character becomes vacuously easy, just retrieve it from the input. To use self-attention as an autoregressive model, we'll need to ensure that it cannot look forward into the sequence. We do this by applying a mask to the matrix of dot products, before the softmax is applied. This mask disables all elements above the diagonal of the matrix. 
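As a minimal sketch of those shifted input/target pairs (assuming the corpus has already been loaded as a single 1D tensor of character indices; the masking code itself follows below), a random training batch could be sampled like this:

import torch

data = ...  # a 1D LongTensor of character indices, e.g. the whole of enwik8
seq_length, batch_size = 256, 32

# random starting positions for each sequence in the batch
starts = torch.randint(0, data.size(0) - seq_length - 1, (batch_size,)).tolist()

# the target is the same subsequence, shifted one character to the left
inputs  = torch.stack([data[s : s + seq_length] for s in starts])
targets = torch.stack([data[s + 1 : s + seq_length + 1] for s in starts])
# inputs[i, j] is used to predict targets[i, j], the character that follows it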
Masking the self attention, to ensure that elements can only attend to input elements that precede them in the sequence. Note that the multiplication symbol is slightly misleading: we actually set the masked out elements (the white squares) to \(-\infty\) Since we want these elements to be zero after the softmax, we set them to \(-\infty\). Here's how that looks in pytorch: indices = torch.triu_indices(k, k, offset=0) matrices[:, indices[0], indices[1]] = float('-inf') After we've handicapped the self-attention module like this, the model can no longer look forward in the sequence. We train on the standard enwik8 dataset (taken from the Hutter prize), which contains \(10^8\) characters of Wikipedia text (including markup). During training, we generate batches by randomly sampling subsequences from the data. We train on sequences of length 256, using a model of 12 transformer blocks and 256 embedding dimension. After about 24 hours training on an RTX 2080Ti (some 170K batches of size 32), we let the model generate from a 256-character seed: for each character, we feed it the preceding 256 characters, and look what it predicts for the next character (the last output vector). We sample from that with a temperature of 0.5, and move to the next character. The output looks like this: 1228X Human & Rousseau. Because many of his stories were originally published in long-forgotten magazines and journals, there are a number of [[anthology|anthologies]] by different collators each containing a different selection. His original books have been considered an anthologie in the [[Middle Ages]], and were likely to be one of the most common in the [[Indian Ocean]] in the [[1st century]]. As a result of his death, the Bible was recognised as a counter-attack by the [[Gospel of Matthew]] (1177-1133), and the [[Saxony|Saxons]] of the [[Isle of Matthew]] (1100-1138), the third was a topic of the [[Saxony|Saxon]] throne, and the [[Roman Empire|Roman]] troops of [[Antiochia]] (1145-1148). The [[Roman Empire|Romans]] resigned in [[1148]] and [[1148]] began to collapse. The [[Saxony|Saxons]] of the [[Battle of Valasander]] reported the y Note that the Wikipedia link tag syntax is correctly used, that the text inside the links are represent reasonable subjects for links. Most importantly, note that there is a rough thematic consistency; the generated text keeps on the subject of the bible, and the Roman empire, using different related terms at different points. While this is far form the performance of a model like GPT-2, the benefits over a similar RNN model are clear already: faster training (a similar RNN model would take many days to train) and better long-term coherence. In case you're curious, the Battle of Valasander seems to be an invention of the network. At this point, the model achieves a compression of 1.343 bits per byte on the validation set, which is not too far off the state of the art of 0.93 bits per byte, achieved by the GPT-2 model (described below). Design considerations To understand why transformers are set up this way, it helps to understand the basic design considerations that went into them. The main point of the transformer was to overcome the problems of the previous state-of-the-art architecture, the RNN (usually an LSTM or a GRU). Unrolled, an RNN looks like this: The big weakness here is the recurrent connection. 
while this allows information to propagate along the sequence, it also means that we cannot compute the cell at time step \(i\) until we've computed the cell at timestep \(i - 1\). Contrast this with a 1D convolution: In this model, every output vector can be computed in parallel with every other output vector. This makes convolutions much faster. The drawback with convolutions, however, is that they're severely limited in modeling long range dependencies. In one convolution layer, only words that are closer together than the kernel size can interact with each other. For longer dependence we need to stack many convolutions. The transformer is an attempt to capture the best of both worlds. They can model dependencies over the whole range of the input sequence just as easily as they can for words that are next to each other (in fact, without the position vectors, they can't even tell the difference). And yet, there are no recurrent connections, so the whole model can be computed in a very efficient feedforward fashion. The rest of the design of the transformer is based primarily on one consideration: depth. Most choices follow from the desire to train big stacks of transformer blocks. Note for instance that there are only two places in the transformer where non-linearities occur: the softmax in the self-attention and the ReLU in the feedforward layer. The rest of the model is entirely composed of linear transformations, which perfectly preserve the gradient. I suppose the layer normalization is also nonlinear, but that is one nonlinearity that actually helps to keep the gradient stable as it propagates back down the network. Historical baggage If you've read other introductions to transformers, you may have noticed that they contain some bits I've skipped. I think these are not necessary to understand modern transformers. They are, however, helpful to understand some of the terminology and some of the writing about modern transformers. Here are the most important ones. Why is it called self-attention? Before self-attention was first presented, sequence models consisted mostly of recurrent networks or convolutions stacked together. At some point, it was discovered that these models could be helped by adding attention mechanisms: instead of feeding the output sequence of the previous layer directly to the input of the next, an intermediate mechanism was introduced, that decided which elements of the input were relevant for a particular word of the output. The general mechanism was as follows. We call the input the values. Some (trainable) mechanism assigns a key to each value. Then to each output, some other mechanism assigns a query. These names derive from the datastructure of a key-value store. In that case we expect only one item in our store to have a key that matches the query, which is returned when the query is executed. Attention is a softened version of this: every key in the store matches the query to some extent. All are returned, and we take a sum, weighted by the extent to which each key matches the query. The great breakthrough of self-attention was that attention by itself is a strong enough mechanism to do all the learning. Attention is all you need, as the authors put it. They key, query and value are all the same vectors (with minor linear transformations). They attend to themselves and stacking such self-attention provides sufficient nonlinearity and representational power to learn very complicated functions. 
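To make the key-value-store analogy concrete, here is a minimal sketch of that softened lookup, with random tensors standing in for real data: every key matches the query to some extent, and the result is a sum of all values weighted by how well their keys match:

import torch
import torch.nn.functional as F

t, k = 5, 16                # number of stored items, vector dimension
keys   = torch.randn(t, k)  # one key per value in the store
values = torch.randn(t, k)  # the stored values
query  = torch.randn(k)     # what we are looking up

match   = keys @ query              # how well each key matches the query, size (t,)
weights = F.softmax(match, dim=0)   # softened version of "return the best match"
result  = weights @ values          # weighted sum over all values, size (k,)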
The original transformer: encoders and decoders But the authors did not dispense with all the complexity of contemporary sequence modeling. The standard structure of sequence-to-sequence models in those days was an encoder-decoder architecture, with teacher forcing. The encoder takes the input sequence and maps it to a single latent vector representing the whole sequence. This vector is then passed to a decoder which unpacks it to the desired target sequence (for instance, the same sentence in another language). Teacher forcing refers to the technique of also allowing the the decoder access to the input sentence, but in an autoregressive fashion. That is, the decoder generates the output sentence word for word based both on the latent vector and the words it has already generated. This takes some of the pressure off the latent representation: the decoder can user word-for-word sampling to take care of the low-level structure like syntax and grammer and use the latent vector to capture more high-leve semantic structure. Decoding twice with the same latent vector would, ideally, give you two different sentences with the same meaning. In later transformers, like BERT and GPT-2, the encoder/decoder configuration was entirely dispensed with. A simple stack of transformer blocks was found to be sufficient to achieve state of the art in many sequence based tasks. This is sometimes called a decoder-only transformer (for an autoregressive model) or an encoder-only transformer (for a model without masking). Modern transformers Here's a small selection of some modern transformers and their most characteristic details. BERT was one of the first models to show that transformers could reach human-level performance on a variety of language based tasks: question answering, sentiment classification or classifying whether two sentences naturally follow one another. BERT consists of a simple stack of transformer blocks, of the type we've described above. This stack is pre-trained on a large general-domain corpus consisting of 800M words from English books (modern work, from unpublished authors), and 2.5B words of text from English Wikipedia articles (without markup). Pretraining is done through two tasks: MaskingA certain number of words in the input sequence are: masked out, replaced with a random word or kept as is. The model is then asked to predict, for these words, what the original words were. Note that the model doesn't need to predict the entire denoised sentence, just the modified words. Since the model doesn't know which words it will be asked about, it learns a representation for every word in the sequence. Next sequence classificationTwo sequences of about 256 words are sampled that either (a) follow each other directly in the corpus, or (b) are both taken from random places. The model must then predict whether a or b is the case. BERT uses WordPiece tokenization, which is somewhere in between word-level and character level sequences. It breaks words like walking up into the tokens walk and ##ing. This allows the model to make some inferences based on word structure: two verbs ending in -ing have similar grammatical functions, and two verbs starting with walk- have similar semantic function. The input is prepended with a special token. The output vector corresponding to this token is used as a sentence representation in sequence classification tasks like the next sentence classification (as opposed to the global average pooling over all vectors that we used in our classification model above). 
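A minimal sketch of the word-corruption step described above (the 15% / 80% / 10% / 10% proportions are the ones used in the BERT paper; the vocabulary size and mask token id here are placeholders):

import torch

def corrupt(tokens, vocab_size, mask_id, p=0.15):
    # tokens: a LongTensor of token indices; returns the corrupted copy and
    # a boolean mask marking which positions the model must predict
    tokens  = tokens.clone()
    predict = torch.rand(tokens.shape) < p                    # ~15% of positions
    action  = torch.rand(tokens.shape)

    mask_out  = predict & (action < 0.8)                      # 80% of those: masked out
    randomize = predict & (action >= 0.8) & (action < 0.9)    # 10%: random word
    # the remaining 10% are kept as is

    tokens[mask_out]  = mask_id
    tokens[randomize] = torch.randint(0, vocab_size, (int(randomize.sum()),))
    return tokens, predict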
After pretraining, a single task-specific layer is placed after the body of transformer blocks, which maps the general-purpose representation to a task-specific output. For classification tasks, this simply maps the first output token to softmax probabilities over the classes. For more complex tasks, a final sequence-to-sequence layer is designed specifically for the task. The whole model is then re-trained to finetune it for the specific task at hand.

In an ablation experiment, the authors show that the largest improvement as compared to previous models comes from the bidirectional nature of BERT. That is, previous models like GPT used an autoregressive mask, which allowed attention only over previous tokens. The fact that in BERT all attention is over the whole sequence is the main cause of the improved performance. This is why the B in BERT stands for "bidirectional".

The largest BERT model uses 24 transformer blocks, an embedding dimension of 1024 and 16 attention heads, resulting in 340M parameters.

GPT-2 is the first transformer model that actually made it into the mainstream news, after the controversial decision by OpenAI not to release the full model. The reason was that GPT-2 could generate sufficiently believable text that large-scale fake news campaigns of the kind seen in the 2016 US presidential election would become effectively a one-person job.

The first trick that the authors of GPT-2 employed was to create a new high-quality dataset. While BERT used high-quality data (lovingly crafted books and well-edited Wikipedia articles), this creates a certain lack of diversity in the writing style. To collect more diverse data without sacrificing quality, the authors used the social media site Reddit to find a large collection of writing with a certain minimum level of social support (expressed on Reddit as karma).

GPT-2 is fundamentally a language generation model, so it uses masked self-attention like we did in our model above. It uses byte-pair encoding to tokenize the language, which, like the WordPiece encoding, breaks words up into tokens that are slightly larger than single characters but smaller than entire words.

GPT-2 is built very much like our text generation model above, with only small differences in layer order and added tricks to train at greater depths. The largest model uses 48 transformer blocks, a sequence length of 1024 and an embedding dimension of 1600, resulting in 1.5B parameters. They show state-of-the-art performance on many tasks. On the Wikipedia compression task that we tried above, they achieve 0.93 bits per byte.

While the transformer represents a massive leap forward in modeling long-range dependency, the models we have seen so far are still fundamentally limited by the size of the input. Since the size of the dot-product matrix grows quadratically in the sequence length, this quickly becomes the bottleneck as we try to extend the length of the input sequence. Transformer-XL is one of the first successful transformer models to tackle this problem.

During training, a long sequence of text (longer than the model could deal with) is broken up into shorter segments. Each segment is processed in sequence, with self-attention computed over the tokens in the current segment and the previous segment. Gradients are only computed over the current segment, but information still propagates as the segment window moves through the text. In theory, at layer \(n\), information may be used from \(n\) segments ago.
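A minimal sketch of that segment-level recurrence (my own simplification: a single attention head, and without the autoregressive mask and relative position encodings that the real model uses): queries come only from the current segment, while keys and values are computed over the detached memory of the previous segment concatenated with the current one, so no gradients flow into the memory.

```python
import torch
import torch.nn.functional as F

e = 128                                  # embedding dimension
to_queries = torch.nn.Linear(e, e)
to_keys    = torch.nn.Linear(e, e)
to_values  = torch.nn.Linear(e, e)

def segment_attention(x, mem):
    """One head of self-attention over the current segment x (b, t, e),
    with read-only access to the previous segment's outputs mem (b, t_mem, e)."""
    context = torch.cat([mem.detach(), x], dim=1)   # gradients stop at the memory

    q = to_queries(x)                    # queries only for the current tokens
    k = to_keys(context)                 # keys and values over memory + current segment
    v = to_values(context)

    dot = q @ k.transpose(1, 2) / (e ** 0.5)
    return F.softmax(dot, dim=2) @ v     # (b, t, e)

# walking through a long sequence, segment by segment
segments = torch.randn(10, 4, 32, e)     # 10 segments, batch of 4, 32 tokens each
mem = torch.zeros(4, 32, e)
for seg in segments:
    out = segment_attention(seg, mem)
    mem = out                            # becomes the read-only memory for the next segment
```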
A similar trick in RNN training is called truncated backpropagation through time. We feed the model a very long sequence, but backpropagate only over part of it. The first part of the sequence, for which no gradients are computed, still influences the values of the hidden states in the part for which they are.

To make this work, the authors had to let go of the standard position encoding/embedding scheme. Since the position encoding is absolute, it would change for each segment and not lead to a consistent embedding over the whole sequence. Instead they use a relative encoding. For each output vector, a different sequence of position vectors is used that denotes not the absolute position, but the distance to the current output. This requires moving the position encoding into the attention mechanism (which is detailed in the paper). One benefit is that the resulting transformer will likely generalize much better to sequences of unseen length.

Sparse transformers tackle the problem of quadratic memory use head-on. Instead of computing a dense matrix of attention weights (which grows quadratically), they compute the self-attention only for particular pairs of input tokens, resulting in a sparse attention matrix, with only \(n\sqrt{n}\) explicit elements. This allows models with very large context sizes, for instance for generative modeling over images, with large dependencies between pixels.

The tradeoff is that the sparsity structure is not learned, so by the choice of sparse matrix, we are disabling some interactions between input tokens that might otherwise have been useful. However, two units that are not directly related may still interact in higher layers of the transformer (similar to the way a convolutional net builds up a larger receptive field with more convolutional layers).

Beyond the simple benefit of training transformers with very large sequence lengths, the sparse transformer also allows a very elegant way of designing an inductive bias. We take our input as a collection of units (words, characters, pixels in an image, nodes in a graph) and we specify, through the sparsity of the attention matrix, which units we believe to be related. The rest is just a matter of building the transformer up as deep as it will go and seeing if it trains.
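As an illustration (a toy pattern of my own, not the exact layout from the Sparse Transformer paper), here is one way to build such a fixed sparsity structure: each token attends to a local window of recent positions plus a set of strided "summary" positions, giving on the order of \(\sqrt{n}\) connections per token when the window and stride are around \(\sqrt{n}\).

```python
import torch

def local_plus_strided_mask(t, window=16, stride=16):
    """Boolean (t, t) matrix: True where attention is allowed.

    Each position attends to the `window` previous positions (a local band)
    and to every `stride`-th earlier position (strided connections), so each
    row has on the order of window + t/stride allowed entries instead of t.
    """
    pos = torch.arange(t)
    i, j = pos[:, None], pos[None, :]

    causal  = j <= i              # autoregressive: no looking ahead
    local   = (i - j) < window    # nearby tokens
    strided = (j % stride) == 0   # periodic "summary" columns

    return causal & (local | strided)

mask = local_plus_strided_mask(256)
# in the attention layer: dot.masked_fill_(~mask, float('-inf')) before the softmax
print(mask.float().mean())        # fraction of dot products actually allowed
```

Note that masking entries out like this only mimics the sparsity pattern; to actually get the memory savings, you need attention kernels that never materialize the masked-out dot products in the first place.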
Going big

The big bottleneck in training transformers is the matrix of dot products in the self-attention. For a sequence length \(t\), this is a dense matrix containing \(t^2\) elements. At standard 32-bit precision, and with \(t=1000\), a batch of 16 such matrices takes up about 250Mb of memory. Since we need at least four of them per self-attention operation (before and after the softmax, plus their gradients), that limits us to at most twelve layers in a standard 12Gb GPU. In practice, we get even less, since the inputs and outputs also take up a lot of memory (although the dot product dominates).

And yet models reported in the literature contain sequence lengths of over 12000, with 48 layers, using dense dot-product matrices. These models are trained on clusters, of course, but a single GPU is still required to do a single forward/backward pass. How do we fit such humongous transformers into 12Gb of memory? There are three main tricks:

Half precision: On modern GPUs and on TPUs, tensor computations can be done efficiently on 16-bit float tensors. This isn't quite as simple as just setting the dtype of the tensor to torch.float16. For some parts of the network, like the loss, 32-bit precision is required. But most of this can be handled with relative ease by existing libraries. Practically, this doubles your effective memory.

Gradient accumulation: For a large model, we may only be able to perform a forward/backward pass on a single instance. Batch size 1 is not likely to lead to stable learning. Luckily, we can perform a single forward/backward for each instance in a larger batch, and simply sum the gradients we find (this is a consequence of the multivariate chain rule). When we hit the end of the batch, we do a single step of gradient descent, and zero out the gradient. In PyTorch this is particularly easy: you know that optimizer.zero_grad() call in your training loop that seems so superfluous? If you don't make that call, the new gradients are simply added to the old ones.
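A minimal, self-contained sketch of what that looks like in a training loop (with a stand-in linear model and random data; the real model would of course be a big transformer):

```python
import torch

model = torch.nn.Linear(10, 2)                    # stand-in for a large transformer
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
criterion = torch.nn.CrossEntropyLoss()

accumulate = 16   # one optimizer step per 16 forward/backward passes

optimizer.zero_grad()
for i in range(160):
    x, y = torch.randn(1, 10), torch.randint(2, (1,))   # a "batch" of a single instance

    # dividing by `accumulate` makes the summed gradients an average over the
    # effective batch, matching what a real batch with a mean loss would give
    loss = criterion(model(x), y) / accumulate
    loss.backward()                  # adds to .grad instead of overwriting it

    if (i + 1) % accumulate == 0:
        optimizer.step()             # one update for the whole accumulated batch
        optimizer.zero_grad()        # only now do we clear the accumulated gradients
```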
Gradient checkpointing: If your model is so big that even a single forward/backward won't fit in memory, you can trade off even more computation for memory efficiency. In gradient checkpointing, you separate your model into sections. For each section, you do a separate forward/backward to compute the gradients, without retaining the intermediate values for the rest. PyTorch has special utilities for gradient checkpointing. For more information on how to do this, see this blogpost.

Conclusion

The transformer may well be the simplest machine learning architecture to dominate the field in decades. There are good reasons to start paying attention to them if you haven't been already.

Firstly, the current performance limit is purely in the hardware. Unlike convolutions or LSTMs, the current limitations to what they can do are entirely determined by how big a model we can fit in GPU memory and how much data we can push through it in a reasonable amount of time. I have no doubt we will eventually hit the point where more layers and more data won't help anymore, but we don't seem to have reached that point yet.

Second, transformers are extremely generic. So far, the big successes have been in language modelling, with some more modest achievements in image and music analysis, but the transformer has a level of generality that is waiting to be exploited.

The basic transformer is a set-to-set model. So long as your data is a set of units, you can apply a transformer. Anything else you know about your data (like local structure) you can add by means of position embeddings, or by manipulating the structure of the attention matrix (making it sparse, or masking out parts).

This is particularly useful in multi-modal learning. We could easily combine a captioned image into a set of pixels and characters, and design some clever embeddings and sparsity structure to help the model figure out how to combine and align the two. If we combine the entirety of our knowledge about our domain into a relational structure like a multi-modal knowledge graph (as discussed in [3]), simple transformer blocks could be employed to propagate information between multimodal units, and to align them, with the sparsity structure providing control over which units directly interact.

So far, transformers are still primarily seen as a language model. I expect that in time, we'll see them adopted much more in other domains, not just to increase performance, but to simplify existing models, and to allow practitioners more intuitive control over their models' inductive biases.

References

[1] The illustrated transformer, Jay Alammar.
[2] The annotated transformer, Alexander Rush.
[3] The knowledge graph as the default data model for learning on heterogeneous knowledge, Xander Wilcke, Peter Bloem, Victor de Boer.
[4] Matrix factorization techniques for recommender systems, Yehuda Koren et al.