How many parents and professionals have experienced the following scenario? The child in question is reading very fluently (Landi & Ryherd, 2017) but comprehending very little of what s/he is reading. Attempts at remediation follow (oftentimes without the administration of a comprehensive assessment) with a focus on reading texts and answering text-related questions. However, much to everyone's dismay, the problem persists and worsens over time. The child's mental health suffers as a result, since numerous studies show that reading deficits, including dyslexia, are associated with depression, anxiety, attention problems, and behavioral problems (Arnold et al., 2005; Knivsberg & Andreassen, 2008; Huc-Chabrolle et al., 2010; Kempe, Gustafson, & Samuelsson, 2011; Boyes et al., 2016; Livingston et al., 2018). Continue reading Comprehending Reading Comprehension
But is this the Best Practice Recommendation?
Those of you familiar with my blog know that a number of my posts take the form of extended responses to posts and comments on social media which deal with certain questionable speech pathology trends and ongoing issues (e.g., controversial diagnostic labels, questionable recommendations, non-evidence-based practices, etc.). So, today, I'd like to talk about sweeping general recommendations as they pertain to literacy interventions. Continue reading But is this the Best Practice Recommendation?
Dear Reading Specialist, May I Ask You a Few Questions?
Because the children I assess often require supplementary reading instruction services, parents frequently ask me how they can best determine whether a reading specialist has the right experience to help their child learn how to read. So today's blog post describes what type of knowledge reading specialists ought to possess and what type of questions parents (and other professionals) can ask them in order to determine their approaches to treating the literacy-related difficulties of struggling learners. Continue reading Dear Reading Specialist, May I Ask You a Few Questions?
Free Literacy Resources for Parents and Professionals
SLPs are constantly on the lookout for good quality, affordable materials in the area of literacy. However, what many clinicians may not realize is that there are massive amounts of FREE evidence-based literacy-related resources available online for their use. These materials can be easily adapted or implemented as is by parents, teachers, speech-language pathologists, as well as other literacy-focused professionals (e.g., tutors, etc.).
Below, I have compiled a rather modest list of my preferred resources (including a few articles) for children in grades pre-K through 12, pertaining to the following literacy-related areas: Continue reading Free Literacy Resources for Parents and Professionals
Speech, Language, and Literacy Fun with Helen Lester’s Picture Books
Picture books are absolutely wonderful for both assessment and treatment purposes! They are terrific as narrative elicitation aids for children of various ages, ranging from pre-K through fourth grade. They are amazing treatment aids for addressing a variety of speech, language, and literacy goals that extend far beyond narrative production. Continue reading Speech, Language, and Literacy Fun with Helen Lester’s Picture Books
New Products for the 2017 Academic School Year for SLPs
September is quickly approaching, and school-based speech-language pathologists (SLPs) are preparing to go back to work. Many of them are looking to update their arsenal of speech and language materials for the upcoming academic school year.
With that in mind, I wanted to update my readers regarding all the new products I have recently created with a focus on assessment and treatment in speech-language pathology. Continue reading New Products for the 2017 Academic School Year for SLPs
Phonological Awareness Screening App Review: ProPA
Summer is in full swing, and for many SLPs that means a welcome break from work. However, for me, it's business as usual, since my program runs year-round, and we have just started our extended school year program.
Of course, even my program is a bit light on activities during the summer. There are lots of field trips, creative and imaginative play, and less focus on academics as compared to the school year. However, I'm also highly cognizant of summer learning loss: the phenomenon characterized by the loss of academic skills and knowledge over the course of the summer holidays.
According to Cooper et al. (1996), while typical students generally lose about one month of learning, there is actually a significant degree of variability in loss based on socioeconomic status (SES). According to Cooper's study, low-income students lose approximately two months of achievement. Furthermore, ethnic minorities, twice-exceptional students (2xE), as well as students with language disorders tend to be disproportionately affected (Graham et al., 2011; Kim & Guryan, 2010; Kim, 2004). Finally, it is important to note that, according to research, summer loss is particularly prominent in the area of literacy (Graham et al., 2011).
So this summer I have been busy screening the phonological awareness abilities (PA) of an influx of new students (our program enrolls quite a few students during the ESY), as well as rescreening PA abilities of students already on my caseload, who have been receiving services in this area for the past few months.
Why do I focus so intensively on phonological awareness (PA)? Because PA is a precursor to emergent reading. It helps children to manipulate sounds in words (see Age of Acquisition of PA Skills). Children need to attain PA mastery (along with a host of other literacy-related skills) in order to become good readers.
When children exhibit poor PA skills for their age, it is a red flag for reading disabilities. Thus, it is very important to assess the child's PA abilities in order to determine their proficiency in this area.
While there are a number of comprehensive tests available in this area, for the purposes of my screening I prefer to use the ProPA app by Smarty Ears.
The Profile of Phonological Awareness (Pro-PA) is an informal phonological awareness screening tool. According to the developers, on average it takes approximately 10 to 20 minutes to administer, based on the child's age and skill level. In my particular setting (an outpatient school program based in a psychiatric hospital), it takes approximately 30 minutes to administer to students on an individual basis. It is by no means a comprehensive tool such as the CTOPP-2 or the PAT-2, as there are not enough trials, complexity, or PA categories to qualify it as a full-blown informal assessment. However, it is a highly useful measure for a quick determination of a student's strengths and weaknesses with respect to phonological awareness. Given its current retail price of $29.99 on iTunes, it is a relatively affordable phonological awareness screening option, as the app allows its users to store data and generates a two-page report at the completion of the screening.
The Pro-PA assesses six different skill areas:
- Rhyming
  - Identification
  - Production
- Blending
  - Syllables
  - Sounds
- Sound Isolation
  - Initial
  - Final
  - Medial
- Segmentation
  - Words in sentences
  - Syllables in words
  - Sounds in words
  - Words with consonant clusters
- Deletion
  - Syllables
  - Sounds
  - Words with consonant clusters
- Substitution
  - Sounds in initial position of words
  - Sounds in final position of words
After the completion of the screening, the app generates a two-page report which describes the students’ abilities as:
- Achieved (80%+ accuracy)
- Emerging (~50-79% accuracy)
- Not achieved (0-50% accuracy)
The above is perfect for quickly tracking progress or for generating goals to target the students' phonological awareness weaknesses. While the report can certainly be provided as an attachment to parents and teachers, I usually tend to summarize its findings in my own reports for the purpose of brevity. Below is one example of what that looks like:
The Profile of Phonological Awareness (Pro-PA), an informal phonological awareness screening was administered to “Justine” in May 2017 to further determine the extent of her phonological awareness strengths and weaknesses.
On the Pro-PA, “Justine” evidenced strengths (80-100% accuracy) in the areas of rhyme identification, initial and final sound isolation in words, syllable segmentation, as well as substitution of sounds in initial position in words.
She also evidenced emerging abilities (~60-66% accuracy) in the areas of syllable and sound blending in words, as well as sound segmentation in CVC words.
However, the Pro-PA also revealed weaknesses (inability to perform) in the areas of rhyme production, isolation of medial sounds in words, segmentation of words, segmentation of sounds in words with consonant blends, deletion of first sounds and consonant clusters, as well as substitution of sounds in final position in words. Continuation of therapeutic intervention is recommended in order to improve "Justine's" abilities in these phonological awareness areas.
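For clinicians who like to track these descriptors across repeated screenings, the report's bands are easy to reproduce in a few lines of code. The sketch below is a hypothetical helper, not part of the Pro-PA app; since the published bands overlap at 50%, the exact boundary handling here is my assumption.

```python
# Hypothetical helper mirroring the Pro-PA report's descriptive bands.
# NOTE: the handling of scores at exactly 50% and 80% is an assumption;
# the app's internal cutoffs are not published.
def propa_band(accuracy_pct):
    """Map a percent-accuracy score to a Pro-PA-style descriptor."""
    if accuracy_pct >= 80:
        return "Achieved"
    if accuracy_pct >= 50:
        return "Emerging"
    return "Not achieved"

# "Justine's" emerging skills (~60-66% accuracy) fall in the middle band:
print(propa_band(63))   # Emerging
```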
Now that you know how I quickly screen and rescreen my students' phonological awareness abilities, I'd love to hear from you! What screening instruments are you using (free or paid) to assess your students' phonological awareness abilities? Do you feel that they are more or less comprehensive/convenient than the ProPA?
References:
- Cooper, H., Nye, B., Charlton, K., Lindsay, J., & Greathouse, S. (1996). The effects of summer vacation on achievement test scores: A narrative and meta-analytic review. Review of Educational Research, 66, 227–268.
- Graham, A., McNamara, J. K., & Van Lankveld, J. (2011). Closing the summer learning gap for vulnerable learners: An exploratory study of a summer literacy programme for kindergarten children at-risk for reading difficulties. Early Child Development and Care, 181, 575–585.
- Kim, J. S. (2004). Summer reading and the ethnic achievement gap. Journal of Education for Students Placed at Risk, 9, 169–188.
- Kim, J. S., & Guryan, J. (2010). The efficacy of a voluntary summer book reading intervention for low-income Latino children from language minority families. Journal of Educational Psychology, 102(1), 20–31.
Review of the Test of Integrated Language and Literacy (TILLS)
The Test of Integrated Language & Literacy Skills (TILLS) is an assessment of oral and written language abilities in students 6–18 years of age. Published in the fall of 2015, it is unique in that it aims to thoroughly assess skills such as reading fluency, reading comprehension, phonological awareness, spelling, as well as writing in school-age children. As I have been using this test since the time it was published, I wanted to take the opportunity today to share a few of my impressions of this assessment.
First, a little background on why I chose to purchase this test so shortly after I had purchased the Clinical Evaluation of Language Fundamentals – 5 (CELF-5). Soon after I started using the CELF-5, I noticed that it tended to considerably overinflate my students' scores on a variety of its subtests. In fact, I noticed that unless a student had a fairly severe degree of impairment, the majority of his/her scores came out either low average or only slightly below average (click for more info on why this was happening HERE, HERE, or HERE). Consequently, I was excited to hear about the TILLS's development, almost simultaneously through ASHA as well as the SPELL-Links ListServ. I was particularly happy because I knew that some of this test's developers (e.g., Dr. Elena Plante and Dr. Nickola Nelson) had published solid research in the areas of psychometrics and literacy, respectively.
According to the TILLS developers, it has been standardized for three purposes:
- to identify language and literacy disorders
- to document patterns of relative strengths and weaknesses
- to track changes in language and literacy skills over time
The subtests can be administered in isolation (with the exception of a few), or the test can be given in its entirety. The administration of all 15 subtests may take approximately an hour and a half, while the administration of the core subtests typically takes ~45 minutes.
Please note that there are 5 subtests that should not be administered to students 6;0-6;5 years of age because many typically developing students are still mastering the required skills:
- Subtest 5 – Nonword Spelling
- Subtest 7 – Reading Comprehension
- Subtest 10 – Nonword Reading
- Subtest 11 – Reading Fluency
- Subtest 12 – Written Expression
However, if needed, there are several tests of early reading and writing abilities available for the assessment of children under 6;5 years of age with suspected literacy deficits (e.g., the Test of Early Reading Ability–Third Edition (TERA-3), the Test of Early Written Language–Third Edition (TEWL-3), etc.).
Let’s move on to take a deeper look at its subtests. Please note that for the purposes of this review all images came directly from and are the property of Brookes Publishing Co (clicking on each of the below images will take you directly to their source).
1. Vocabulary Awareness (VA) (description above) requires students to display considerable linguistic and cognitive flexibility in order to earn an average score. It works great in teasing out students with weak vocabulary knowledge and use, as well as students who are unable to quickly and effectively analyze words for deeper meaning and come up with effective definitions of all possible word associations. Be mindful of the fact that even though the words are presented to the students in written format in the stimulus book, the examiner is still expected to read all the words to the students. Consequently, students with good vocabulary knowledge and strong oral language abilities can still pass this subtest despite the presence of significant reading weaknesses. Recommendation: I suggest informally checking the student's word reading abilities by asking them to read all of the words before reading all the word choices to them. This way you can informally document any word misreadings made by the student even in the presence of an average subtest score.
2. The Phonemic Awareness (PA) subtest (description above) requires students to isolate and delete initial sounds in words of increasing complexity. While this subtest does not require sound isolation and deletion in various word positions, similar to tests such as the CTOPP-2: Comprehensive Test of Phonological Processing–Second Edition or the Phonological Awareness Test 2 (PAT-2), it is still a highly useful and reliable measure of phonemic awareness (as one of many precursors to reading fluency success). This is especially true because, after the initial directions are given, the student is expected to remember to isolate the initial sounds in words without any prompting from the examiner. Thus, this task also indirectly tests the students' executive function abilities in addition to their phonemic awareness skills.
3. The Story Retelling (SR) subtest (description above) requires students to do just that: retell a story. Be mindful of the fact that the presented stories have reduced complexity. Thus, unless the students possess significant retelling deficits, the above subtest may not capture their true retelling abilities. Recommendation: Consider supplementing this subtest with informal narrative measures. For younger children (kindergarten and first grade), I recommend using wordless picture books to perform a dynamic assessment of their retelling abilities following a clinician's narrative model (e.g., HERE). For early elementary-aged children (grades 2 and up), I recommend using picture books, which are first read to and then retold by the students with the benefit of pictorial but not written support. Finally, for upper elementary-aged children (grades 4 and up), it may be helpful for the students to retell a book or a movie seen recently (or liked significantly) by them without the benefit of visual support altogether (e.g., HERE).
4. The Nonword Repetition (NR) subtest (description above) requires students to repeat nonsense words of increasing length and complexity. Weaknesses in the area of nonword repetition have consistently been associated with language impairments and learning disabilities due to the task's heavy reliance on phonological segmentation as well as phonological and lexical knowledge (Leclercq, Maillart, & Majerus, 2013). Thus, both monolingual and simultaneously bilingual children with language and literacy impairments will be observed to present with patterns of segment substitutions (subtle substitutions of sounds and syllables in presented nonsense words), as well as segment deletions in nonword sequences more than 2-3 or 3-4 syllables in length (depending on the child's age).
5. The Nonword Spelling (NS) subtest (description above) requires the students to spell nonwords from the Nonword Repetition (NR) subtest. Consequently, the Nonword Repetition (NR) subtest needs to be administered prior to this subtest in the same assessment session. In contrast to real-word spelling tasks, students cannot memorize the spelling of the presented words, which are still bound by the orthographic and phonotactic constraints of the English language. While this is a highly useful subtest, it is important to note that simultaneously bilingual children may present with decreased scores due to vowel errors. Consequently, it is important to analyze subtest results in order to determine whether dialectal differences, rather than the presence of an actual disorder, are responsible for the error patterns.
6. The Listening Comprehension (LC) subtest (description above) requires the students to listen to short stories and then definitively answer story questions via available answer choices: "Yes," "No," and "Maybe." This subtest also indirectly measures the students' metalinguistic awareness skills, as these are needed to detect when the text does not provide sufficient information to answer a particular question definitively (i.e., when a "Maybe" response is called for). Be mindful of the fact that because the students are not expected to provide sentential responses to questions, it may be important to supplement this subtest with another listening comprehension assessment. Tests such as the Listening Comprehension Test-2 (LCT-2), the Listening Comprehension Test-Adolescent (LCT-A), or the Executive Function Test-Elementary (EFT-E) may be useful if language processing and listening comprehension deficits are suspected or reported by parents or teachers. This is particularly important to do with students who may be 'good guessers' but who are also reported to present with word-finding difficulties at sentence and discourse levels.
7. The Reading Comprehension (RC) subtest (description above) requires the students to read short stories and answer story questions in "Yes," "No," and "Maybe" format. This subtest is not stand-alone and must be administered immediately following the administration of the Listening Comprehension subtest. The student is asked to read the first story out loud in order to determine whether s/he can proceed with taking this subtest or discontinue due to being an emergent reader; the cutoff for discontinuation is making 7 errors during the reading of the first story and its accompanying questions. Unfortunately, in my clinical experience this subtest is not always accurate at identifying children with reading-based deficits.
While I find it terrific for students with severe-profound reading deficits and/or below-average IQ, a number of my students with average IQ and moderately impaired reading skills managed to pass it via a combination of guessing and luck, despite being observed to misread aloud between 40 and 60% of the presented words. Be mindful of the fact that such students may typically make up to 5-6 errors during the reading of the first story. Thus, according to administration guidelines, these students will be allowed to proceed and take this subtest. They will then continue to make text misreadings during each story presentation (you will know that by asking them to read each story aloud vs. silently). However, because the response mode is a definitive ("Yes," "No," "Maybe") vs. an open-ended question format, a number of these students will earn average scores by being successful guessers. Recommendation: I highly recommend supplementing the administration of this subtest with grade-level (or below grade-level) texts (see HERE and/or HERE) to assess the student's reading comprehension informally.
I present a full one page text to the students and ask them to read it to me in its entirety. I audio/video record the student’s reading for further analysis (see Reading Fluency section below). After the completion of the story I ask the student questions with a focus on main idea comprehension and vocabulary definitions. I also ask questions pertaining to story details. Depending on the student’s age I may ask them abstract/ factual text questions with and without text access. Overall, I find that informal administration of grade level (or even below grade-level) texts coupled with the administration of standardized reading tests provides me with a significantly better understanding of the student’s reading comprehension abilities rather than administration of standardized reading tests alone.
8. The Following Directions (FD) subtest (description above) measures the student's ability to execute directions of increasing length and complexity. It measures the student's short-term, immediate, and working memory, as well as their language comprehension. What is interesting about the administration of this subtest is that the graphic symbols (e.g., objects, shapes, letters and numbers, etc.) the student is asked to modify remain covered as the instructions are given (to prevent visual rehearsal). After being presented with the oral instruction, the students are expected to move the card covering the stimuli and then to execute the visual-spatial, directional, sequential, and logical if-then instructions by marking them on the response form. The fact that the visual stimuli remain covered until the last moment increases the demands on the student's memory and comprehension. The subtest was created to simulate a teacher's use of procedural language (giving directions) in a classroom setting (as per the developers).
9. The Delayed Story Retelling (DSR) subtest (description above) needs to be administered during the same session as the Story Retelling (SR) subtest, approximately 20 minutes after the SR subtest's administration. Despite the relatively short passage of time between the two subtests, it is considered a measure of long-term memory as related to narrative retelling of reduced complexity. Here, the examiner can compare the student's performance on both measures to determine whether the student did better or worse on either of them (e.g., recalled more information after a period of time passed vs. immediately after being read the story). However, as mentioned previously, some students may recall this previously presented story fairly accurately and, as a result, may obtain an average score despite a history of teacher/parent-reported long-term memory limitations. Consequently, it may be important for the examiner to supplement this subtest with a recall of a movie/book recently seen/read by the student (a few days prior) in order to compare both performances and note any weaknesses/limitations.
10. The Nonword Reading (NR) subtest (description above) requires students to decode nonsense words of increasing length and complexity. What I love about this subtest is that the students are unable to effectively guess words (as many tend to routinely do when presented with real words). Consequently, the presentation of this subtest will tease out which students have good letter/sound correspondence abilities as well as solid orthographic, morphological, and phonological awareness skills, and which ones have merely memorized sight words and are now having difficulty decoding unfamiliar words as a result.
11. The Reading Fluency (RF) subtest (description above) requires students to read facts which make up simple stories fluently and correctly. Here, the keys to attaining an average score are accuracy and automaticity. In contrast to the previous subtest, the words are now presented in meaningful, simple syntactic contexts.
It is important to note that the Reading Fluency subtest of the TILLS has a negatively skewed distribution. As per authors, “a large number of typically developing students do extremely well on this subtest and a much smaller number of students do quite poorly.”
Thus, “the mean is to the left of the mode” (see publisher’s image below). This is why a student could earn an average standard score (near the mean) and a low percentile rank when true percentiles are used rather than NCE percentiles (Normal Curve Equivalent).
Consequently, under certain conditions (see HERE), the percentile rank (vs. the NCE percentile) will be a more accurate representation of the student's ability on this subtest.
Indeed, due to the reduced complexity of the presented words some students (especially younger elementary aged) may obtain average scores and still present with serious reading fluency deficits.
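To see why a negatively skewed distribution behaves this way, consider a small synthetic example (illustrative numbers only, not TILLS norm data): when most students cluster at the ceiling and a few score far below it, the mean is dragged below the mode, and a student scoring right at the mean can sit at a surprisingly low percentile.

```python
# Synthetic, illustrative score set (NOT TILLS norms): most students at
# or near ceiling, a few far below -- a negatively skewed distribution.
scores = [10] * 2 + [20] * 3 + [30] * 5 + [35] * 10 + [38] * 30 + [40] * 50

mean = sum(scores) / len(scores)            # pulled down by the low tail
mode = max(set(scores), key=scores.count)   # the most common (ceiling) score

# True percentile rank of a student scoring exactly at the mean:
pct_rank = 100 * sum(s < mean for s in scores) / len(scores)

print(mean, mode, pct_rank)   # 37.2 40 20.0 -- an "average" score,
                              # yet only the 20th percentile
```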
I frequently see this in students with average IQ and good long-term memory, who by second and third grade have managed to memorize an admirable number of sight words, due to which their reading deficits appear minimized. Recommendation: If you suspect that your student belongs to the above category, I highly recommend supplementing this subtest with an informal measure of reading fluency. This can be done by presenting the student with a grade-level text (I find science and social studies texts particularly useful for this purpose) and asking them to read several paragraphs from it (see HERE and/or HERE).
As the students are reading, I calculate their reading fluency by counting the number of words they read per minute. I find this very useful, as it allows me to better understand their reading profile (e.g., fast/inaccurate reader, slow/inaccurate reader, slow/accurate reader, fast/accurate reader). As the student is reading, I note their pauses, misreadings, word-attack skills, and the like. Then, I write a summary comparing the student's reading fluency on both standardized and informal assessment measures in order to document the student's strengths and limitations.
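For readers who want to formalize this informal calculation, the sketch below computes words correct per minute (WCPM) and assigns one of the four profile labels described above. The 95% accuracy cutoff and the default WCPM benchmark are illustrative assumptions of mine, not TILLS or research-mandated values; in practice you would substitute grade-appropriate norms.

```python
# Hypothetical informal fluency calculation. The accuracy cutoff (95%)
# and the default WCPM benchmark (100) are illustrative assumptions;
# substitute grade-appropriate norms in practice.
def fluency_profile(words_attempted, errors, seconds, wcpm_benchmark=100):
    wcpm = (words_attempted - errors) / (seconds / 60)       # words correct per minute
    accuracy = (words_attempted - errors) / words_attempted  # proportion read correctly
    speed = "fast" if wcpm >= wcpm_benchmark else "slow"
    acc = "accurate" if accuracy >= 0.95 else "inaccurate"
    return wcpm, f"{speed}/{acc} reader"

# A student who attempts 120 words in 60 seconds with 3 misreadings:
print(fluency_profile(120, 3, 60))   # (117.0, 'fast/accurate reader')
```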
12. The Written Expression (WE) subtest (description above) needs to be administered to the students immediately after the administration of the Reading Fluency (RF) subtest because the student is expected to integrate a series of facts presented in the RF subtest into their writing sample. There are 4 stories in total for the 4 different age groups.
The examiner needs to show the student a different story which integrates simple facts into a coherent narrative. After the examiner reads that simple story to the student, s/he is expected to tell the student that the story is okay, but "sounds kind of choppy." The examiner then needs to show the student an example of how the facts could be put together in a way that sounds more interesting and less choppy by combining sentences (see below). Finally, the examiner will ask the student to rewrite the story presented to them in a similar manner (e.g., "less choppy and more interesting").
After the student finishes his/her story, the examiner will analyze it and generate the following scores: a discourse score, a sentence score, and a word score. Detailed instructions as well as the Examiner's Practice Workbook are provided to assist with scoring, as it takes a bit of training as well as trial and error to complete, especially if examiners are not familiar with certain procedures (e.g., calculating T-units).
Full disclosure: Because the above subtest is still essentially sentence combining, I have only used it a handful of times with my students. Typically, when I've used it in the past, most of my students fell into two categories: those who failed it completely (by either copying text word for word, failing to generate any written output, etc.) and those who passed it with flying colors but still presented with notable written output deficits. Consequently, I've replaced the Written Expression subtest's administration with the administration of standardized written tests, which I supplement with informal grade-level expository, persuasive, or narrative writing samples.
Having said that, many clinicians may not have access to other standardized written assessments, or may lack the time to administer entire standardized written measures (which may frequently take between 60 and 90 minutes of administration time). Consequently, in the absence of other standardized writing assessments, this subtest can be effectively used to gauge the student's basic writing abilities and, if needed, be effectively supplemented by the informal writing measures mentioned above.
13. The Social Communication (SC) subtest (description above) assesses the students’ ability to understand vocabulary associated with communicative intentions in social situations. It requires students to comprehend how people with certain characteristics might respond in social situations by formulating responses which fit the social contexts of those situations. Essentially students become actors who need to act out particular scenes while viewing select words presented to them.
Full disclosure: Similar to my infrequent administration of the Written Expression subtest, I have also administered this subtest very infrequently to students. Here is why.
I am an SLP who works full-time in a psychiatric hospital with children diagnosed with significant psychiatric impairments and concomitant language and literacy deficits. As a result, a significant portion of my job involves comprehensive social communication assessments to catalog my students' significant deficits in this area. Yet past administration of this subtest showed me that a number of my students could pass it quite easily despite presenting with notable and easily evidenced social communication deficits. Consequently, I prefer the administration of comprehensive social communication testing when working with children in my hospital-based program or in my private practice, where I perform independent comprehensive evaluations of language and literacy (IEEs).
Again, as I've previously mentioned, many clinicians may not have access to other standardized social communication assessments, or may lack the time to administer such measures in their entirety. Consequently, in the absence of other social communication assessments, this subtest can be used to get a baseline of the student's basic social communication abilities and then be supplemented with informal social communication measures such as the Informal Social Thinking Dynamic Assessment Protocol (ISTDAP) or observational social pragmatic checklists.
14. The Digit Span Forward (DSF) subtest (description above) is a relatively isolated measure of short-term and verbal working memory (it minimizes demands on other aspects of language, such as syntax or vocabulary).
15. The Digit Span Backward (DSB) subtest (description above) assesses the student's working memory and requires the student to mentally manipulate the presented stimuli in reverse order. It allows the examiner to observe the strategies (e.g., verbal rehearsal, visual imagery, etc.) the students use to aid themselves in the process. Please note that the Digit Span Forward subtest must be administered immediately before this subtest.
SLPs who have used tests such as the Clinical Evaluation of Language Fundamentals – 5 (CELF-5) or the Test of Auditory Processing Skills – Third Edition (TAPS-3) should be highly familiar with both subtests as they are fairly standard measures of certain aspects of memory across the board.
To continue, in addition to subtests which assess students’ literacy abilities, the TILLS also possesses a number of interesting features.
For starters, there is the TILLS Easy Score, which allows examiners to do their scoring online. It is incredibly easy and effective. After clicking on the link and filling out the preliminary demographic information, all the examiner needs to do is plug in the raw subtest scores; the system does the rest. After the raw scores are plugged in, the system generates a PDF document with all the data, which includes (but is not limited to) standard scores, percentile ranks, as well as a variety of composite and core scores. The examiner can then save the PDF on their device (laptop, PC, tablet, etc.) for further analysis.
Then there is the quadrant model. According to the TILLS sampler (HERE), “it allows the examiners to assess and compare students’ language-literacy skills at the sound/word level and the sentence/discourse level across the four oral and written modalities—listening, speaking, reading, and writing” and then create “meaningful profiles of oral and written language skills that will help you understand the strengths and needs of individual students and communicate about them in a meaningful way with teachers, parents, and students” (p. 21).
Then there is the Student Language Scale (SLS), a one-page checklist that parents, teachers (and even students) can fill out to informally identify language- and literacy-based strengths and weaknesses. It allows for meaningful input from multiple sources regarding the student’s performance (as per IDEA 2004) and can be used not just with the TILLS but with other tests, or even in isolation (as per the developers).
Furthermore, according to the developers, because the normative sample included several special needs populations, the TILLS can be used with students diagnosed with ASD, students who are deaf or hard of hearing (see caveat), as well as students with intellectual disabilities (as long as they are functioning at a developmental age of 6 or above).
According to the developers, the TILLS is aligned with the Common Core Standards and can be administered as frequently as twice a year for progress monitoring (minimum of 6 months after the first administration).
With respect to bilingualism, examiners can use it with caution with simultaneous English learners but not with sequential English learners (see further explanation HERE). Translations of the TILLS are definitely not allowed, as they would undermine the test’s validity and reliability.
So there you have it: these are just a few of my impressions regarding this test. Some of you may notice that I spent a significant amount of time pointing out the test’s limitations. However, it is very important to note that research indicates there is no such thing as a “perfect standardized test” (see HERE for more information). All standardized tests have their limitations.
Having said that, I think that TILLS is a PHENOMENAL addition to the standardized testing market, as it TRULY appears to assess not just language but also literacy abilities of the students on our caseloads.
That’s all from me; however, before signing off I’d like to provide you with more resources and information, which can be reviewed in reference to TILLS. For starters, take a look at Brookes Publishing TILLS resources. These include (but are not limited to) TILLS FAQ, TILLS Easy-Score, TILLS Correction Document, as well as 3 FREE TILLS Webinars. There’s also a Facebook Page dedicated exclusively to TILLS updates (HERE).
But that’s not all. Dr. Nelson and her colleagues have been tirelessly lecturing about the TILLS for a number of years, and many of their past lectures and presentations are available on the ASHA website as well as on the web (e.g., HERE, HERE, HERE, etc). Take a look at them as they contain far more in-depth information regarding the development and implementation of this groundbreaking assessment.
To access TILLS fully-editable template, click HERE
Disclaimer: I did not receive a complimentary copy of this assessment for review nor have I received any encouragement or compensation from either Brookes Publishing or any of the TILLS developers to write it. All images of this test are direct property of Brookes Publishing (when clicked on all the images direct the user to the Brookes Publishing website) and were used in this post for illustrative purposes only.
Related Posts:
- Components of Comprehensive Dyslexia Testing: Part I- Introduction and Language Testing
- Part II: Components of Comprehensive Dyslexia Testing – Phonological Awareness and Word Fluency Assessment
- Part III: Components of Comprehensive Dyslexia Testing – Reading Fluency and Reading Comprehension
- Part IV: Components of Comprehensive Dyslexia Testing – Writing and Spelling
- Special Education Disputes and Comprehensive Language Testing: What Parents, Attorneys, and Advocates Need to Know
- Why (C) APD Diagnosis is NOT Valid!
- What Are Speech Pathologists To Do If the (C)APD Diagnosis is NOT Valid?
- What do Auditory Memory Deficits Indicate in the Presence of Average General Language Scores?
- Why Are My Child’s Test Scores Dropping?
- Comprehensive Assessment of Adolescents with Suspected Language and Literacy Disorders
Teaching Metalinguistic Vocabulary for Reading Success
In my therapy sessions I spend a significant amount of time improving the literacy skills (reading, spelling, and writing) of language impaired students. In my work with these students I emphasize goals with a focus on phonics, phonological awareness, encoding (spelling), etc. However, what I have frequently observed in my sessions are significant gaps in the students’ foundational knowledge pertaining to the basics of sound production and letter recognition. Basic examples of these foundational deficiencies include students not being able to fluently name the letters of the alphabet, understand the difference between vowels and consonants, or fluently engage in sound/letter correspondence tasks (e.g., name a letter and then quickly and accurately identify which sound it makes). Consequently, a significant portion of my sessions involves explicit instruction in the above concepts.
This got me thinking about my students’ vocabulary knowledge in general. We SLPs spend a significant amount of time on explicit and systematic vocabulary instruction with our students because, compared to typically developing peers, they have immature and limited vocabulary knowledge. But do we teach our students the abstract vocabulary necessary for reading success? Do we explicitly teach them the definitions of a letter, a word, a sentence, etc.?
A number of my colleagues are skeptical. “Our students already have poor comprehension,” they tell me. “Why should we tax their memory with abstract words of little meaning to them?” And I agree with them, of course, but only up to a point.
I agree that our students have working memory and processing speed deficits as a result of which they have a much harder time learning and recalling new words.
However, I believe that not teaching them the meanings of select words pertaining to language is a huge disservice to them. Here is why: to be a successful communicator, speaker, reader, and writer, an individual needs to possess adequate metalinguistic skills.
In simple terms, “metalinguistics” refers to the individual’s ability to actively think about, talk about, and manipulate language. Reading, writing, and spelling require an active level of awareness and thought about language. Students with poor metalinguistic skills have difficulty learning to read, write, and spell. They lack awareness that spoken words are made up of individual units of sound, which can be manipulated. They lack awareness that letters form words, words form phrases and sentences, and sentences form paragraphs. They may not understand that letters make sounds or that a word may consist of more letters than sounds (e.g., ‘ship’ has four letters but only three sounds). The bottom line is that students with decreased metalinguistic skills cannot effectively use language to talk about concepts like sounds, letters, or words unless they are explicitly taught those abilities.
So I do! Furthermore, I can tell you that explicit instruction in metalinguistic vocabulary significantly improves my students’ understanding of the tasks involved in attaining literacy competence. Even my students with mild to moderate intellectual disabilities benefit significantly from understanding the meanings of letters, words, sentences, etc.
I even created a basic abstract vocabulary handout to facilitate my students’ comprehension of these words (FREE HERE). While by no means exhaustive, it is a decent starting point for teaching my students the vocabulary needed to improve their metalinguistic skills.
For older elementary-aged students with average IQ, I only provide the words I want them to define and then ask them to look up their meanings online using a PC or an iPad. This turns the vocabulary activity into a critical thinking and executive function task.
Students need to figure out the appropriate search string needed to locate the answer, as well as which definition comes closest to clearly and effectively defining the presented word. One of the things I really like about the Google online dictionary is that it provides multiple definitions of the same word along with word origins. As a result, it teaches students to carefully review and reflect upon their selected definition in order to determine its appropriateness.
A word of caution, though, regarding the use of Kiddle, a Google-powered search engine for children. While it’s great for locating child-friendly images, it is not appropriate for locating abstract definitions of words. To illustrate, when you type the search string “what is the definition of a letter?” into Google, you will get several responses which appropriately match some meanings of your query. However, the same search string in Kiddle will merely yield helpful tips on writing a letter, as well as images of envelopes with stamps affixed to them.
In contrast to the above, I use more structured vocabulary-defining activities with younger elementary-aged students as well as students with intellectual impairments. I provide simple definitions of abstract words, attach images and examples to each definition, and create cloze activities with several answer choices in order to ensure my students’ comprehension of these words.
I find that this and other metalinguistic activities significantly improve my students’ comprehension of abstract words such as ‘communication’, ‘language’, and ‘literacy’. They cease being mere buzzwords, frequently heard yet consistently misunderstood. To my students these words begin to come to life, brim with meaning, and inspire numerous ‘aha’ moments.
Now that you’ve had a glimpse of my therapy sessions, I’d love to have a glimpse of yours. What metalinguistic goals related to literacy are you targeting with your students? Comment below to let me know.
Part II: Components of Comprehensive Dyslexia Testing – Phonological Awareness and Word Fluency Assessment
A few days ago I posted my first installment in the comprehensive assessment of dyslexia series, discussing common dyslexia myths as well as general language testing as a starting point in the dyslexia testing battery. (You can find this post HERE).
Today I would like to discuss the next two steps in dyslexia assessment, which are phonological awareness and word fluency testing.
Let’s begin with phonological awareness (PA). Phonological awareness is a precursor to emergent reading. It allows children to understand and manipulate sounds in order to form or break down words. It’s one of those interesting types of knowledge which is a prerequisite to everything and definitive of nothing. I like to compare it to taking a statistics course in college: you need it as a prerequisite to entering a graduate speech pathology program, but successfully completing it does not mean that you will graduate from the program. Similarly, children need phonological awareness mastery in order to move on and build upon existing skills to become emergent readers; however, simply having this mastery does not a good reader make (hence, this is only one of the tests in the dyslexia battery).
When a child has poor phonological awareness for his/her age, it is a red flag for reading disabilities. Thus, it is very important to assess the child’s ability to successfully manipulate sounds (e.g., by isolating, segmenting, blending, etc.) in order to produce real or nonsense words.
Why are nonsense words important?
According to Shaywitz (2003), “The ability to read nonsense words is the best measure of phonological decoding skill in children” (pp. 133-134). Being able to decode and manipulate (blend, segment, etc.) nonsense words is a good indication that the child is acquiring comprehension of the alphabetic principle (understanding sound-letter correspondence, or which common sounds are made by specific letters). It is a very important part of a dyslexia battery, since nonsense words cannot be memorized or guessed but need to be “truly decoded.”
While a number of standardized tests assess phonological awareness skills, my personal preference is the Comprehensive Test of Phonological Processing-2 (CTOPP-2), which assesses the following areas:
- Phonological Segmentation
- Blending Words
- Sound Matching
- Initial, Medial and Final Phoneme Isolation
- Blending Nonwords
- Segmenting Nonwords
- Memory for Digits
- Nonword Repetition
- Rapid Digit Naming
- Rapid Letter Naming
- Rapid Color Naming
- Rapid Object Naming
As you can see from the above description, it assesses not only children’s ability to manipulate real words but also their ability to manipulate nonsense words. It also assesses word fluency skills via a host of rapid naming tasks, making it a very convenient tool to have as part of your dyslexia testing battery.
This brings us to another integral part of the dyslexia testing battery, which is word fluency (WF) testing. During word fluency tasks a child is asked to rapidly generate words on a particular topic under timed constraints (e.g., name as many animals as you can in 1 minute). We test this rapid naming ability because we want to see how quickly and accurately the child can process information, an ability very much needed to become a fluent reader.
Poor readers can name a number of items, but they may not be able to efficiently categorize these words. Furthermore, they will produce the items with significantly decreased processing speed as compared to good readers. Decreased word fluency is a significant indicator of reading deficits. It is frequently observable in children with reading disabilities when they encounter a text with which they lack familiarity. That is why this ability is very important to test.
Several tests can be used for this purpose, including the CTOPP-2 and the Rapid Automatized Naming and Rapid Alternating Stimulus Tests (RAN/RAS), just to name a few. However, since the CTOPP-2 already has a number of subtests which assess this skill, I prefer to use it to test both phonological awareness and word fluency.
Read part III of this series which discusses components of Reading Fluency and Reading Comprehension testing HERE.