Impact Factors: Some getting-started information

While conducting scholarly research you’ve probably seen mentions of ‘impact factors’. We’ll use today’s post to provide a little more information and to link you out to some great resources to learn more.

Generally speaking, impact factors are measures (using varying metrics, but often citation counts) that suggest the ‘importance’ or ‘significance’ of an author, an article, or a journal. Various stakeholders like to use these impact factors as evidence that their work, their journal, their institution, and so on is valuable to the research community.

There are many types of impact factors, and each is calculated differently. Measures for authors use formulas based on how many articles they’ve published and how many times each of those articles has been cited. Measures for journals might look at how often a typical article in that publication is cited each year, and so on.
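
To make that second idea concrete, the classic two-year journal impact factor (the version reported in Journal Citation Reports) is essentially an average citation rate: citations received this year by the items a journal published in the previous two years, divided by the number of citable items it published in those two years.  A toy calculation, with numbers invented purely for illustration, might look like this:

    # Two-year journal impact factor, using made-up numbers for illustration only.
    citations_in_2016_to_2014_2015_items = 1200   # citations this year to the last two years' articles
    citable_items_2014_2015 = 400                 # articles and reviews published in those two years

    impact_factor_2016 = citations_in_2016_to_2014_2015_items / citable_items_2014_2015
    print(impact_factor_2016)  # 3.0

A result of 3.0 simply means that, on average, the journal’s recent articles were each cited about three times in 2016.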

A couple of common author-level impact measures include:

  • H-Index: An author-level metric used to calculate an individual scholar’s research impact (see the short sketch just after this list).  Check out this guide from Boston College to learn more.
  • Altmetric: As described on the Altmetric website, “Altmetrics are metrics and qualitative data that are complementary to traditional, citation-based metrics. They can include (but are not limited to) peer reviews on Faculty of 1000, citations on Wikipedia and in public policy documents, discussions on research blogs, mainstream media coverage, bookmarks on reference managers like Mendeley, and mentions on social networks such as Twitter.”  Check out our blog post about Altmetrics for more info.
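
To make the h-index concrete: an author’s h-index is the largest number h such that h of their papers have each been cited at least h times.  Here’s a minimal sketch in Python (the function name and the sample citation counts are ours, purely for illustration):

    def h_index(citation_counts):
        """Return the largest h such that h papers have at least h citations each."""
        counts = sorted(citation_counts, reverse=True)
        h = 0
        for rank, cites in enumerate(counts, start=1):
            if cites >= rank:
                h = rank
            else:
                break
        return h

    # Five papers cited 10, 8, 5, 4, and 3 times give an h-index of 4.
    print(h_index([10, 8, 5, 4, 3]))

Notice that a single highly cited paper can only ever add one to the h-index; whether that is a feature or a flaw is part of the debate discussed below.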

What gets a little complex, of course, is whether these formulas and calculations are accurate measures of something like ‘impact’. Some of the formulas include self-citations, which can skew results. Others have argued that citation counts alone do not reflect the full ‘impact’ of a work. As mentioned above, the recent ‘altmetrics’ movement tracks impact in other spaces, such as social media, to give a more complete picture of how a work is being used and discussed.

Some tools, like Google Scholar’s author profiles, include a list of author-level citation metrics. You can view this data on an author’s profile page, and if you hover over the name of a metric, a definition of how it was calculated will appear:

Screenshot of author profile

There can be a TON to know about impact factors and citation metrics, and some of it can get quite complicated. UC Irvine has a wonderful LibGuide which provides a lot of this information in easily digestible chunks: http://guides.lib.uci.edu/c.php?g=334451&p=2249950.

The University of Illinois at Chicago also has a great guide which covers the journal impact factor in particular: http://researchguides.uic.edu/if/impact. Lastly, Elsevier has put together a nice list of various impact measures and their formulas: https://www.elsevier.com/authors/journal-authors/measuring-a-journals-impact.

Next time you see a journal advertising its impact factor, or you’re trying to understand the influence of a scholar’s research in their field, consider drawing on some of these tools to gain better insight into what those impact factors and influences might mean.

Happy Searching!

PsycTESTS — a great portal to full-text tests, measures, and scales

The PsycTESTS database is a great resource for researchers looking for full-text tests and measures, along with more information about them. While ‘Psyc’ might be in the name, this is an excellent tool for students from all programs, offering access to tests/measurements/scales related to everything from political attitudes to racial bias to career aspirations.

In this post, we’ll cover the basics of searching in PsycTESTS and try to answer the dreaded ‘what if full text is unavailable?’ question.

My Nielsen Questionnaire

Photo by Joe Gratz. CC license.

Basic Searching

You can connect to PsycTESTS from the ‘Databases’ link on the library homepage.  As with most databases, you can search for tests/measures containing certain keywords by entering a term or two into the search box. *Note: you can choose how you would like your results sorted on the search page:

Screenshot of search page


You can then choose to scroll through results or use the filters on the left side of the screen to refine the search further.  If a record includes a copy of the instrument itself you will see a ‘Test’ link accompanied by a PDF icon.  Click on ‘Test’ to open the PDF copy:

Screenshot of 'test' icon

No ‘Test’ Link Available?

Many researchers feel a sense of dread when there is no full-text link available for the measurement they seek.  However, PsycTESTS includes information that can often easily lead you to the measurement you need.  We’ll outline some simple steps to follow.

For this example, let’s imagine you want to access the ‘Political Ideology Measure’ from the screenshot above. The first step is to click on the name of the test itself so you can view the full record in the database.

Next, scroll down to a section labeled ‘Test Development Record’:

Screenshot of test development record

Just below the ‘Test Development Record’ you will see the heading ‘Reported In’.  This is a citation for the work in which the test/measure was originally reported.  Even though PsycTESTS doesn’t have a full copy of the test, you can consult the original article for a copy (assuming the authors included it).

**Note: You can click on the ‘Test Development Record’ to bring up a full list of information about the test, including reliability/validity, author contact information, whether it is commercially licensed, and more.

Once you know the original reporting article, you can connect to Google Scholar through the library to quickly determine whether we have full-text access to the work:

Screenshot of Google Scholar check

In the event you get this far and can’t find a full-text copy of the original article, remember you can always submit an interlibrary loan request for a copy by following the ‘Order An Article’ link on the library homepage.

Happy Searching!

View on Reviews

As you may know, there are many types of reviews in academic literature. Systematic reviews, meta-analyses, scoping reviews, literature reviews, and more! With such nuanced differences it can sometimes be hard to know what type of review would be most helpful in your research, or what type of review you’d like to write.

In this post, we’d like to link out to some great guides and resources which can help you better understand these differences.  We’ll also include a couple of sample searches illustrating how to best locate reviews in Fielding’s library.

Web Resources

  • Types of Reviews Chart — This comprehensive chart is posted to a LibGuide created by Duke University’s Medical library.  It does a great job of breaking down and defining many review types.
  • Literature Review Guide with E-Lectures — Created by the Gutman Library at Harvard’s Graduate School of Education, this outstanding guide walks through the literature review process and is accompanied by short e-lectures.  This is a highly recommended starting point to learn more about conducting literature reviews.
  • Scoping Reviews Wiki  — Created by a group of health librarians in Canada, this wiki page contains excellent information about scoping reviews (as do their many other pages on various topics)!

Resources in the Library

Don’t forget that you can also find great information about how to do reviews, or sample reviews, in Fielding’s library collection.

The best place to locate materials which offer definitions and guidance on conducting reviews is our Sage Research Methods Online database.

Once you connect via our ‘databases’ list you can simply search for a review type to find related materials:

Screenshot of lit review search

This will often produce a concise definition along with a number of related results. Remember, you can always click on the ‘See more in Methods Map’ link when available to see how your search term relates to other methods and ideas:

Tips on locating sample reviews

While some databases include a ‘document type’ or ‘methodology’ filter which allows you to limit your results to reviews, by and large one of the most effective strategies is to include search terms related to the review type of interest in your search.

Here are a few examples/results utilizing different search tools and topics.

FASTsearch

A sample search for systematic reviews related to autism in young children, written out on a single line, would look like this:

autism AND (toddlers OR “young children”) AND “systematic review”

Here’s a screenshot of the results.  Remember, FASTsearch often returns large quantities of results so you can always use the filters on the left side of the screen to better target what you need:

Screenshot of FASTsearch results

Google Scholar

A sample Google Scholar search for literature reviews or meta-analyses related to organizational leadership might look something like this on a single line: “organizational leadership” AND (“literature review” OR “meta-analysis”)

And a screenshot of the results produced. (Pro Tip: connect to Google Scholar through the library to see our ‘Full-Text @ Fielding’ links):

ProQuest Databases

A sample search for literature reviews related to PTSD and veterans in ProQuest’s psychology database:

On a single line: veterans AND PTSD AND “literature review”

Some of the possible results:

We hope these tips will help you up your review game!

Happy Searching!

Linking to Library Resources

Are you tired of going through the library every time you want to connect to Google Scholar?

Wish those links in your FASTsearch emails always connected back to our resources?

Love PsycINFO and wish you could just go straight to it when it’s time to research?

The Fielding library is pleased to announce that we have enabled a new authentication method that will allow you to link back to library content without first having to go through Moodle or MyFielding!

This new authentication method does not replace the previous method, so you are still able to connect to our library resources however you prefer.

We’ve created a guide which explains the new method in detail, shows you how/where to save links back to our content, and more: http://libguides.fielding.edu/linking. Faculty: be sure to check out the ‘faculty tip’ tab for information on how you can use this method in your syllabi and Moodle pages!

A quick summary:

With this new method, when you follow a link back to a library resource (such as a FASTsearch result), you will first encounter a log-in screen to verify that you are allowed to access the resource:

Screenshot of the log-in screen

Please note: you must use your MyFielding credentials on the log-in page.

Once you input your credentials, you’ll be connected directly to the resource itself.

Some Suggested Uses:


  • Use a citation manager? Check out the ‘citation managers’ tab in the LibGuide above for suggestions on how to integrate our new authentication method with your citation manager.

Since this authentication method is brand new to us, we’re sure there will be some bumps along the road. If you encounter any issues please don’t hesitate to contact the library for assistance. And remember you can always connect to the library via Moodle or MyFielding to access FASTsearch, our databases, and more.

Happy Searching!

Hey, let’s meet!

Greetings Fielding Library Users,

Hasn’t this year flown by?  It’s hard to believe that it’s just about time for another Winter Session.  As always, your friendly librarians will be on-site to provide instruction and reference assistance.

Can’t make it to one of our classes but have a burning library question (or ten)?  We can still help!  Feel free to email the library in advance or drop by the reference desk in the registration area to schedule a time to meet.  We’ll be available for appointments from Thursday-Saturday during SLS week, and from Monday-Wednesday during Psych week.  If for some reason we can’t meet during session, remember that we are always available to schedule a Zoom meeting!

If you’re chomping at the bit for some nerdy library-related news, why not check out Altmetric’s Top 100 of 2016? Altmetric does a lot of work to track how research is talked about in the news and on social media, and each year they bring you the 100 most-discussed articles of the year based on those metrics.  In this year’s list you’ll find:

  • The first academic paper published by a sitting President
  • A work investigating the link between income level and life expectancy
  • A study investigating the relationship between cell phone frequencies and cancers in rats
  • And even a work about the relationship between smartphone use and children’s sleep patterns

Plenty of interesting reading to be had!

Have a safe and happy holiday season!  We’ll see you next year!


Free Resource Highlight: The American Presidency Project

With election day right around the corner, your friendly librarians wanted to highlight an excellent, free resource related to all things presidential: The American Presidency Project.

Image of presidential matchbooks

Image by Jimmie. CC license here.

The American Presidency Project (APP) is a collection of over 100,000 documents hosted by the University of California, Santa Barbara.  As explained on their website, the APP is the only online resource “that has consolidated, coded, and organized into a single searchable database” this plethora of documentation.

What sorts of things are included?

  • Messages and papers of the Presidents
  • Party platforms
  • Speeches (Inaugural addresses, State of the Union addresses, and more)
  • Public Papers
  • And even election data dating back to 1789!

You can search for, select, or browse the collections therein to find exactly what you’re looking for or to discover something new.

That’s certainly not an exhaustive list; just a sampling of the many types of items you will find archived here. They even provide an audio/video archive for those wanting to listen to one of FDR’s Fireside Chats or view Gerald Ford’s remarks on pardoning Richard Nixon.

Not much of a history buff?  You can even view campaign speeches, press releases, and statements from the current candidates here. Or check out the homepage to see newspaper endorsements of the current candidates and how they stack up to previous elections/candidates.

Whether you’re in the thick of a serious research project or just need a distraction as you bite your nails while the results pour in tomorrow, we hope you’ll enjoy this wonderful resource.

Happy Searching!

TBT: This Donut Might Be Good For You and more!

UPDATE: Last fall we brought you this post about a wonderful little tool from Altmetric that lets you see how an article is being talked about on social media.  Since then, many of our library resources have integrated Altmetric data into their records, making it easier than ever to know what other researchers think of a work in real time.

You can check out the original post below to see how to download the special ‘Bookmarklet’ tool, but first, in the ‘Library Updates’ section below, we’ll show you where this data is hiding in Fielding’s library.

Library Updates:

The easiest place to see Altmetric data in our library is on the FASTsearch results page.  Once you find a result you like, just click on the ‘Preview’ button and you’ll find the Altmetric data (when available) at the end of the preview:

Screenshot of a FASTsearch result.


Just click on ‘See more details’ to look at the expanded view which is described in detail below.

Most of the ProQuest databases now also include the Altmetric donut.  ProQuest records include the donut in the bottom right-hand corner of the ‘Abstract/Details’ page.  Now on to the original post…

Original Post:

Mmmmmm….who doesn’t love a deliciously bad for you donut?!

Donuts

Image by Dave Crosby. CC license here.

Wouldn’t it be nice if there were another kind of donut entirely?  Something with the visual appeal of a donut, but which would feed you interesting information about academic articles?  Sounds like gibberish, but it’s not.  Enter: Altmetric.

Image of the Altmetric donut

Image from Altmetric.

Now that we’re hooked and hungry, what exactly is Altmetric?  In the most general sense, it’s a tool meant to bolster or complement citation-based metrics by including data from alternative sources such as social media outlets.

Many of you savvy students know there can be power in analyzing how many times an article has been cited (i.e. looking at citation counts); BUT, how do we begin to consider the ways an article is being cited by or mentioned in non-traditional sources?

We all know that, today, researchers, professionals and the public alike take to Twitter, blogs, Facebook, newspapers, and other media outlets to express thoughts and opinions on every topic under the Sun.

So wouldn’t it be nice to see if an article has been mentioned, for example, on Twitter?  Not only that; wouldn’t it be nice to see how many times an article has been mentioned on Twitter, by whom, and in what context?

Altmetric can track this type of information and presents it as a nice color-coded donut, displaying how many times and through which outlet a work was mentioned. Their simple bookmarklet tool, discussed below, can be used to this end and is completely free.  Each donut will look a bit different depending on the number and types of sources which mention the article (light blue = Twitter; yellow = blogs; etc.).

Pretty cool, right?!

For those who respond ‘heck yeah!’ jump on down to the ‘Altmetric Bookmarklet’ section.  For those who say ‘but why?!’, read on!

But Why….

This is a good question. First, we can probably agree that any article metric, even a citation count, is not necessarily a measure of the article’s impact or quality.  However, we generally accept that it can be a nice way to know which articles seem to be talked about frequently in their field.

The problem? The scholarly publishing cycle is not the fastest mechanism in the world.  You may read an article, cite it in an article you write, then have your work published 2 years later.  But, you may also read the same article, take to your blog or Twitter with ease, and comment on it within an hour of reading it.

Social Media apps

Image by Jason Howie. CC license here.

It’s these latter mentions–the way a work is being immediately engaged with via social media–that can be hard to measure but offer interesting information on the way a paper, idea, or research is being used.  Altmetric is attempting to fill this gap by tracking how articles are mentioned in these harder-to-measure spaces.

Altmetric Bookmarklet

*Disclaimer: Fielding does not have an institutional Altmetric subscription; however, the Altmetric Bookmarklet lets you take advantage of much of their data for free!

The ‘bookmarklet’ is basically a little button you add to your browser’s bookmarks bar, like so:

Screenshot of the Altmetric bookmarklet in a browser toolbar


Altmetric has a webpage which explains everything you need to know about adding the bookmarklet to your browser and using it (http://www.altmetric.com/bookmarklet.php).  Or, if you’re more of a visual learner, you may appreciate their 45-second getting-started video below:

Once you’ve got the button in your toolbar, you can simply click on it (while viewing the webpage for an article) to see if they’ve collected any data:

Screenshot of the Altmetric bookmarklet showing data for an article


Alright, so the Twitter count is kind of interesting, but is that it?  No way!  Click the spot which says ‘click for more details’ to get to the good stuff.

The top of the details page will provide you with some summary information, and, more importantly, the option to sign up to receive alerts any time this article receives additional mentions!

Screenshot of the top of the Altmetric details page


The middle and bottom portions of the details page show the breakdown of mentions by type (note the separate ‘Twitter demographics’ and ‘Mendeley readers’ sections, for example).  Particularly interesting is that you can see not only a geographic breakdown of where these users were, but also whether each ‘tweeter’ was a practitioner, scientist, etc.:

Screenshot of the Altmetric demographics breakdown


Now, your librarian’s favorite part of all this: seeing these mentions in context!  Back at the top of the screen, you will see that we landed on the ‘Summary’ page.  But, in this example, you will also see a ‘Twitter’ tab.  If this article had other types of mentions, you would also see tabs for things such as ‘Blogs’, ‘News’, and so on.  Clicking on the tab for a particular source will display the actual mentions in context:

Screenshot of the Altmetric Twitter tab

The top of the page will explain the number of tweets, number of users, and potential number of followers who viewed those tweets.  Below, you can see the tweet itself and the way in which the article was mentioned: were they singing praises? Retweeting someone else’s posts? Taking issue with the work?  The details are right there.

A Few Things to Know…

This tool is pretty awesome, but there is always fine print.  You will see on the Altmetric site that they do have a few caveats regarding the bookmarklet, as is the case with any tool.

For one, the tool only works on pages/articles which contain certain types of information, such as a DOI number. This means it functions best with more recent works.

Also, it only works on articles which use certain types of metadata.  This explanation could get tedious and boring; suffice it to say that it works well with some sources and not others.

For library users, it may be worth noting that Altmetric seems to work best on the actual webpage for an article. From your librarian’s tests, it works well on database pages created by the publisher (such as SAGE, Springer, or Taylor & Francis).  However, it is not the best at reading pages within vendor databases (such as ProQuest or EBSCO).  If you really want Altmetric data for an article found in one of those databases, it may be best to locate the article through the publisher’s website and use the bookmarklet there.

Bottom Line: There will certainly not be Altmetric data available for everything.  In fact, chances are there will be a great number of articles for which there is no data. That is not necessarily because the bookmarklet doesn’t work, but because not all works are mentioned on social media, some articles were published too long ago, and so on.

More Examples

It might be interesting to see a few screenshots of other donut examples to give you a sense of how they change.  Enjoy these images below!

This article includes Twitter and Facebook mentions:

donut with facebook and twitter


This article, from Nature, shows a lovely donut representing mentions via various formats like news outlets, blogs, Twitter, Google+, and more!

donut with many mention types


We hope you enjoy exploring this new tool!  Happy Searching!

DataSearch from Elsevier

To put it plainly: it can be a pain to find openly available data.  It can feel like a rather slow-going, tedious process to endlessly skim through articles and databases just trying to locate one table or dataset…

4 Easy Ways to Speed Up Your PC

Image by li kelly. CC license here.

So what do you do when you want to get your hands and eyes on the data quickly?

Lucky for us, the good folks at Elsevier are working on a new tool: Elsevier DataSearch.  As they explain on their FAQs page: “We are interested in exploring what a search engine for research data would look like (as opposed to a web search engine or a document search engine), and are talking with users and data providers about their needs and interests.”  You can also visit the FAQs page to learn more about what types of content are indexed and from which sources.

Now, let’s get some of the fine print out of the way: this tool is being actively developed and is still in beta, so what you see may not be the final product.  However, it is available now, and Elsevier would love your feedback if you use it.

Soooo…how does it work?

When you connect to DataSearch, you will find a familiar search engine-like interface:

Screenshot of the DataSearch homepage

Next, just as you would with any other search engine, input some keywords related to your research interest(s). Note that, even though the tool is created by Elsevier, content across domains and subjects is indexed here. For example, I tried searching for data related to “income disparity”:

Screenshot of a DataSearch search for “income disparity”

Let’s break down the different features available on the results page (woohoo!):

Screenshot of the DataSearch results page

  • ‘Types’: This first filter allows you to refine your search by, of course, the type of data available.  This is useful if you are specifically looking for raw data files, or an image to help you represent a claim, and so on.
  • ‘Sources’: This filter allows you to refine by the actual data source.  While you may want to use this as a refinement, this filter also just provides a nice snapshot of where the majority of your results come from.
  • ‘Date’: Like any good search tool, DataSearch also lets you filter results by date.  This is particularly useful for researchers needing data that reflects a given time period.

In addition to those filters, there are a few other things to be aware of on the results page.  First, your total number of results is listed just below the main search box.  Second, the types of data associated with each record are listed just below the description.  This is a helpful way to quickly see if the record will provide the data in the format you are seeking.

Now, let’s dive deeper and look at an individual record (oohs and aahs):

Screenshot of an individual DataSearch record

In this case, I chose a record which had ‘Tabular Data’ available.  Once I click on the record, an expanded menu becomes available to me.  From here, I can use the options on the left side of the record to look through the data (e.g.: Description, Tables 1, 2, 3, and A1).

When I select a table I am then able to view all of the associated data.  I can also click ‘Go to data source’ at the bottom of the record to learn more about the article in which this data was originally published.

**NOTE: DataSearch is still in development and is not integrated with Fielding’s library resources.  If you follow the ‘Go to data source’ button, you will be redirected to the source on the open web.  Remember to use your stellar search abilities to check for access within our library.**

Keep in mind:

  • Each record will look a bit different depending on the types of data available
  • This resource is in development so there will certainly be minor errors or glitches. Be sure to use the options to provide feedback to Elsevier so they can make improvements.

Happy Searching!


Get close…but not too close…

As we’ve mentioned in other help resources, there’s a little trick called ‘phrase searching’ which allows you to find multiple search terms together in a specific order by enclosing them in quotation marks.  While that’s a great strategy, what do you do when you need to find your terms close together but not in any particular order? Well, as you might have guessed, there’s a trick for that too: proximity searching. (Yaaaaay!)

Image: nerd alert

Image by ebbmart. CC license here.

(Now that that’s out of the way….) This is a particularly useful strategy when you’re searching for a concept that can be expressed in a number of ways.

For example, let’s say you’re looking for sample dissertations in which the author developed their own testing instrument. You hop on over to the dissertation database, highlight the search box, and then….wait…how do you search for this?

Searching for the phrase “instrument development” is too specific and relies too heavily on other researchers using the same phrase as you.  But searching for ‘development’ AND ‘instrument’ is too broad and you know you’ll be inundated with results to sift through.  This is the perfect situation for the proximity search!

So, how do you do it?

Proximity searching requires some particularly strange syntax.  Essentially, you use a symbol (typically the letter ‘N’) to tell the database you want to find your search terms ‘near’ each other, along with a number that represents how many words apart your search terms may be.

The syntax, then, ends up looking something like: ‘Search term 1’ N/# ‘Search term 2’

What?  I know, it sounds strange, so let’s go a bit farther with our example. When you think about it, there are several ways a researcher might describe developing an instrument:

“the development of an instrument”

“for this study an instrument was developed”

“the process of developing the instrument”

I can see that the terms ‘development’ and ‘instrument’ are usually no more than two or three words apart. I have also noticed that people are likely to use different endings of the word ‘develop’. So in the ProQuest Dissertations database I would build my proximity search like so:

instrument N/3 develop* (forgot what the asterisk does? Check out our post on truncation here.)

Screenshot of the proximity search in ProQuest

Some snippets of results returned from this search include:

  • “the purpose of this study was to develop an instrument for the assessment of”
  • “the instrument developed consists of behavioral descriptors”
  • “I developed an instrument to assess whether”
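
If it helps to see what ‘within three words, in either order’ actually means, here’s a rough sketch in Python.  It’s only an illustration of the idea (the function name, the simple word-splitting, and the way distance is counted are our own assumptions; the databases handle this far more cleverly):

    import re

    def near(text, term, stem, n):
        """Rough check: does `term` appear within n words of any word
        starting with `stem`, in either order?"""
        words = re.findall(r"\w+", text.lower())
        term_positions = [i for i, w in enumerate(words) if w == term]
        stem_positions = [i for i, w in enumerate(words) if w.startswith(stem)]
        # Treat n as the maximum distance, in word positions, between the two terms.
        return any(abs(i - j) <= n for i in term_positions for j in stem_positions)

    # Mimics: instrument N/3 develop*  -> True
    print(near("I developed an instrument to assess whether...", "instrument", "develop", 3))
    # No develop* near 'instrument' -> False
    print(near("the instrument is described in a later chapter", "instrument", "develop", 3))

Under this rough definition, all three of the phrasings listed earlier come back as matches for instrument N/3 develop*, which is exactly why the proximity search casts a wider, yet still targeted, net.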

Prefer to see this tip in action?  Check out our quick tip proximity searching video:

The fine print…as per usual…is that different databases may use different syntax/symbols for a proximity search.  I showed the ‘N/#’ example as this is used by all ProQuest databases.  Should you ever try to run a proximity search and find it does not work as you expected, consult the help guide for the tool you’re using to see their preferred syntax.

Happy Searching!

Library Hack — emailing links from FASTsearch

For today’s post, I wanted to show you a useful library ‘hack’ to help you better access items you’ve emailed yourself from FASTsearch.

As you’ve likely seen, when you email yourself a list of citations from the temporary save folder in FASTsearch, all you really get is a bare-bones citation and a link:

Screenshot of a FASTsearch email


While it’s great to have the citation, sometimes following the links to get back to full-text can cause a bit of frustration…

Broken

Image by Quinn Dombrowski. CC license here.

Okay, hopefully they don’t lead to a broken computer, but you may have noticed that sometimes you follow a link only to be greeted with our proxy server’s error message:

Screenshot of proxy error message


Why is this happening?

Well, our proxy server needs to know that you’re someone who is authorized to access our resources before it will let you do so.  When you follow a saved/bookmarked URL from an email (or wherever), the proxy server has no idea who you are, so instead of redirecting you to the page in the link, it displays the error message.

What can you do?

While I haven’t found a perfect fix, I can share the most reliable work-around I’ve found so far.  (Of course, other than this work-around, you can always re-find the item in FASTsearch, use Google Scholar through the library, or connect to the larger publication via the ‘Journal and Book Title’ look-up.)

Caveat: It’s important to remember that this is just a hack/work-around so there is no way to guarantee it will work in every instance, with every computer, or with every browser.  There are many variables at play–but this is something worth trying.

Alright, step one: connect to the library homepage:

Screenshot of the library homepage


Step two: copy and paste the URL from your emailed FASTsearch result into the address bar, on top of the library’s URL:

Screenshot of the FASTsearch URL pasted over the library homepage URL


And that’s it.  Ideally, when you input it this way it will take you straight to the article in its database:

Screenshot of the article page after the redirect

Click image to enlarge.

Just remember: connect first to the library homepage, copy the link from your email, and paste it on top of the homepage URL. As Emeril would say, ‘Bam!’

Other Details

I’ve tested this method out in a few different scenarios.  While I find that it tends to work with both the Firefox and Chrome browsers, I haven’t had as much success with it using Internet Explorer. Also, I should note that I operate on a PC, so I would love to hear Mac users’ experiences to find out if it works the same, or if it works with Safari.

Happy searching, and copying/pasting!